`nanobind` is a tool and a library for implementing native C/C++ extensions for Python. You can implement such extensions in many ways, but `nanobind` makes life a bit easier, especially if you write C++ and use CMake to build your native code.
`nanobind` recommends `scikit-build-core` as the build backend; it handles the heavy lifting of driving CMake during the Python package build process. It even supports using Ninja. And as a cherry on top, it can download CMake and Ninja from PyPI if they are not available locally.
A setup using `nanobind` + `scikit-build-core` can produce pre-built wheels that can be distributed to users much like pure Python modules: the user installs the wheel and then imports things the usual way; the fact that the underlying functionality is implemented in C/C++ is more or less transparent. The only difficulty is that C/C++ extension packages are platform-specific, whereas pure Python packages are usually universal.
- [1] https://nanobind.readthedocs.io/en/latest/index.html
- [2] https://scikit-build-core.readthedocs.io/en/latest/
Anyway, there are some example projects that describe how to set up a project with `nanobind`:
- [3] https://nanobind.readthedocs.io/en/latest/building.html
- [4] https://nanobind.readthedocs.io/en/latest/api_cmake.html
- [5] https://github.com/wjakob/nanobind_example
Host tooling:
- C++ compiler
- `python3`, `python3-venv`, and `python3-dev` or similar
- (optional) `cmake` and `ninja`
The setup consists of the following parts:

- Setting up a virtual env
- Writing `pyproject.toml`
- Writing the extension stub (`.cpp`)
- Writing the top-level `CMakeLists.txt`
- Writing a dummy Python package
```sh
$ python3 -m venv venv
$ source venv/bin/activate
$ pip install nanobind 'scikit-build-core[pyproject]'
# optionally also:
$ pip install cmake ninja
```
Note that this is only necessary for local testing and development; when distributing the final package, these dependencies are automatically installed during the build process (see the `build-system.requires` entry in `pyproject.toml` below).
At minimum you'll need:
```toml
[build-system]
requires = ["scikit-build-core >=0.4.3", "nanobind >=1.3.2"]
build-backend = "scikit_build_core.build"

[project]
name = "supermodule"
version = "0.0.1"
```
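`scikit-build-core` also accepts optional settings through a `[tool.scikit-build]` table in the same file. As a sketch (these particular values are borrowed from my reading of the `nanobind_example` project [5]; they are not required for a minimal setup):

```toml
[tool.scikit-build]
# Protect the configuration against breaking changes in scikit-build-core
minimum-version = "0.4"
# Cache CMake builds in a local directory, keyed by wheel tag
build-dir = "build/{wheel_tag}"
```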
The contents here are not really important; this note only focuses on the project setup itself. You can use the example code from [5], for example:
```cpp
// put this into my_ext.cpp
#include <nanobind/nanobind.h>

namespace nb = nanobind;
using namespace nb::literals;

NB_MODULE(my_ext, m) {
    m.doc() = "This is a \"hello world\" example with nanobind";
    m.def("add", [](int a, int b) { return a + b; }, "a"_a, "b"_a);
    m.attr("the_answer") = 42;
}
```
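Bindings are not limited to free functions. As a quick sketch (the `Dog` class is a made-up example; `nb::class_`, `nb::init`, and `def_rw` are part of the nanobind API, but double-check the docs [1]), a C++ type could be exposed like this:

```cpp
#include <nanobind/nanobind.h>
#include <nanobind/stl/string.h>  // enables automatic std::string <-> str conversion

namespace nb = nanobind;

struct Dog {
    explicit Dog(std::string name) : name(std::move(name)) {}
    std::string name;
    std::string bark() const { return name + ": woof!"; }
};

NB_MODULE(my_ext, m) {
    nb::class_<Dog>(m, "Dog")
        .def(nb::init<std::string>())  // constructor: Dog("Rex")
        .def("bark", &Dog::bark)
        .def_rw("name", &Dog::name);   // expose the field as a read/write attribute
}
```

On the Python side this would behave like a regular class: `my_ext.Dog("Rex").bark()`.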
You'll need the usual CMake boilerplate (`cmake_minimum_required(...)`, `project(...)`).
Full example: https://github.com/wjakob/nanobind_example/blob/master/CMakeLists.txt
Then you'll need to add:
```cmake
# Try to import all Python components potentially needed by nanobind.
# Adjust version to match your required minimum python version.
find_package(Python 3.8
  REQUIRED COMPONENTS Interpreter Development.Module
  OPTIONAL_COMPONENTS Development.SABIModule)

# Import nanobind through CMake's find_package mechanism
find_package(nanobind CONFIG REQUIRED)

nanobind_add_module(my_ext my_ext.cpp)

# link other libraries the usual way:
# target_link_libraries(my_ext PRIVATE some_other_lib)

# Install directive for scikit-build-core
install(TARGETS my_ext LIBRARY DESTINATION supermodule)
```
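As a side note, the optional `Development.SABIModule` component above relates to stable-ABI builds. According to the nanobind CMake API docs [4], `nanobind_add_module()` accepts a `STABLE_ABI` flag for this; a sketch, untested here:

```cmake
# Build against the CPython stable ABI (Python 3.12+), so a single wheel
# can cover multiple Python versions; see [4] for details and caveats
nanobind_add_module(my_ext STABLE_ABI my_ext.cpp)
```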
Note that the first `find_package()` should usually find the required components from your host system's environment, while the latter nanobind-specific CMake module is provided by the `nanobind` Python package, which now resides in our virtual environment (`venv/`). In our virtual environment, the CMake module is provided in `venv/lib/python3.12/site-packages/nanobind/cmake/nanobind-config.cmake` (for example). Note that if you attempt to configure the project as-is, it will likely fail, since that module is not in CMake's module search path (`CMAKE_PREFIX_PATH`):
```
$ cmake -B out -S . -G Ninja
CMake Error at CMakeLists.txt:20 (find_package):
  Could not find a package configuration file provided by "nanobind" with any
  of the following names:

    nanobindConfig.cmake
    nanobind-config.cmake
...
```
You can overcome this problem with:
```sh
$ cmake -B out -S . -G Ninja -DCMAKE_PREFIX_PATH=$PWD/venv/lib/python3.12/site-packages/nanobind/cmake
```
However, this is annoying and clearly specific to the path of the current venv and Python version. Some documentation suggests using this instead:
```cmake
execute_process(
  COMMAND "${Python_EXECUTABLE}" -m nanobind --cmake_dir
  OUTPUT_STRIP_TRAILING_WHITESPACE OUTPUT_VARIABLE NB_DIR)
list(APPEND CMAKE_PREFIX_PATH "${NB_DIR}")
find_package(nanobind CONFIG REQUIRED)
```
which does the same thing, but dynamically at configure time. The magic sauce is `python3 -m nanobind --cmake_dir`, which queries the required path in the current environment; in our venv it should print:
```sh
$ python3 -m nanobind --cmake_dir
<project-specific-prefix>/venv/lib/python3.12/site-packages/nanobind/cmake
```
Just as we want, excellent.
Notes:

- The `execute_process()` trick does not seem to be necessary when building the package outside the venv, for example when running `python3 -m build`. I guess the build backend (`scikit-build-core`) extends the `CMAKE_PREFIX_PATH` dynamically for us. But this stuff is good to know anyway.
- You don't need to shoehorn everything into the `my_ext` module; you can keep it as simple as possible while simply linking to existing CMake-enabled libraries in other project(s). You can import existing projects into the build with `find_package()` if they have proper CMake config modules, or even with `add_subdirectory()` if the project is a local submodule; see the sketch below.
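As a minimal sketch of the latter (assuming a hypothetical `mylib` project vendored under `third_party/` that defines a `mylib` target):

```cmake
# Pull a local submodule into the build; assumes third_party/mylib
# has its own CMakeLists.txt defining the target "mylib"
add_subdirectory(third_party/mylib)
# or, if mylib ships a proper CMake config module:
# find_package(mylib CONFIG REQUIRED)

nanobind_add_module(my_ext my_ext.cpp)
target_link_libraries(my_ext PRIVATE mylib)
```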
```sh
$ mkdir -p src/supermodule
```
then in `src/supermodule/__init__.py` write:
```python
from .my_ext import add, the_answer

__all__ = [
    'add',
    'the_answer',
]
```
Things to note:

- The `supermodule` is a dummy module for wrapping the native extension `my_ext`.
- The `my_ext` is a native shared library object (`.so`) that is built during the package build process; it does not exist yet.
- The `install()` directive in the `CMakeLists.txt` places the `my_ext` library into our `supermodule` directory during the build process.
- The import module name (`src/supermodule`) does not need to match the name of the project (`supermodule`) in the `pyproject.toml`; having the same name for both is just a convention.
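Putting it all together, the project tree should now look roughly like this (ignoring the `venv/` directory):

```
.
├── CMakeLists.txt
├── pyproject.toml
├── my_ext.cpp
└── src/
    └── supermodule/
        └── __init__.py
```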
Now, if everything works, you should be able to do something like this:
```sh
$ python3 -m build -w
$ python3 -m venv venv-new
$ ./venv-new/bin/pip install dist/supermodule-0.0.1-cp312-cp312-macosx_13_0_arm64.whl
$ ./venv-new/bin/python3 -c 'import supermodule; print(supermodule.add(supermodule.the_answer, 1))'
43
```
Note that the paths and the `.whl` name change based on system and Python version. You can list all files that were provided in the wheel:
```sh
$ find venv-new/ -type f -ipath "*supermodule*"
```
It should list at least the files `<prefix>/supermodule/my_ext.<something>.so` and `<prefix>/supermodule/__init__.py`.
NOTE: This is strictly a demonstration of pre-building the package/wheel. You shouldn't need to do this during development, as it becomes annoying rather quickly. You can simply run `pip install .` to rebuild and install the package automatically. Also, see the note below about "editable rebuild"; it should make life much, much easier.
Create a git repo and add the files created here. Do NOT add the `venv` directories or any built extensions into your repo. Keep the repo strictly sources only.
If you publish the package on GitHub (for example), you should be able to install it directly like this:
```sh
$ python3 -m venv venv-temp
$ ./venv-temp/bin/pip install git+https://github.com/<your-username>/<your-repo>
```
And if everything is working correctly, the module should build and install without errors. Obviously, this requires at least a C++ compiler on the host where you run this command. However, if you distribute pre-built wheels (`.whl`, see above), users do not need anything besides `pip`.
Once the `nanobind`-enabled package is set up, you do not need to interact with the C/C++ compiler, CMake, or Ninja directly. You should be able to accomplish project setup related tasks with just plain `pip` commands. The build backend (`scikit-build-core`) and `nanobind` should handle everything for you automatically, so you can focus on writing C++ and Python code.
I am not yet sure how the native extension provides typing/annotation support; there was some mention of `mypy` in the `nanobind` documentation. To be continued...
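For what it's worth, the nanobind documentation describes a stub generation mechanism that should cover this. As a sketch based on my reading of the CMake API docs [4] (not something I have tested here), a `.pyi` typing stub could be generated at build time with:

```cmake
# Generate a typing stub (my_ext.pyi) from the built extension; the stub
# can then be install()'ed next to the module, like the .so above
nanobind_add_stub(
  my_ext_stub
  MODULE my_ext
  OUTPUT my_ext.pyi
  PYTHON_PATH $<TARGET_FILE_DIR:my_ext>
  DEPENDS my_ext
)
```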
The documentation suggests setting up a local development environment with:
```sh
$ python3 -m venv venv
$ source venv/bin/activate
$ pip install nanobind 'scikit-build-core[pyproject]'
$ pip install --no-build-isolation -ve . -Ceditable.rebuild=true
```
Now, whenever you open the REPL and run `import supermodule`, the native extension is automatically rebuilt through CMake:
```
$ python3
>>> import supermodule
Running cmake --build & --install in <pwd>/build/cp312-cp312-macosx_13_0_arm64
ninja: no work to do.
-- Install configuration: "Release"
-- Up-to-date: <pwd>/venv/lib/python3.12/site-packages/supermodule/my_ext.cpython-312-darwin.so
>>>
```
Neat, isn't it? Although the documentation suggests this is a bit hacky and might have some issues. I don't know; I have not observed any... yet.
Note:

- The `-Ceditable.rebuild=true` (as in `-C/--config-settings`) is a directive to `scikit-build-core`; see [2] for more choices.
- In the editable setup, the module is split in two locations:

  ```
  >>> supermodule.__file__
  '<pwd>/src/supermodule/__init__.py'
  >>> supermodule.my_ext.__file__
  '<pwd>/venv/lib/python3.12/site-packages/supermodule/my_ext.cpython-312-darwin.so'
  ```

- The auto-rebuild is not REPL-specific and should work when importing the module in any other Python script. Useful when doing test driven development; see the sketch below.
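For example, a minimal (hypothetical) pytest file; with `editable.rebuild=true` active, simply running it re-triggers the CMake build when needed:

```python
# test_supermodule.py -- run with: pytest test_supermodule.py
# Importing supermodule triggers the automatic rebuild in the editable setup.
import supermodule

def test_add():
    assert supermodule.add(supermodule.the_answer, 1) == 43
```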