YugabyteDB is best built inside a Docker container, so that the build runs in the same environment the official releases are built in.
I found a suitable build image by looking at YugabyteDB's CI scripts: https://github.com/yugabyte/yugabyte-db/blob/master/.github/workflows/build.yml.
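For example, the builder image name should turn up by grepping a checked-out tree (the pattern is just an illustration):
grep -r yb_build_infra .github/workflows/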
Check out the source code and make sure the build is clean (i.e. not initialized in your local environment); otherwise move or delete the build directory.
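A minimal sketch of that step, cloning from the repository linked above (the backup directory name is arbitrary):
git clone https://github.com/yugabyte/yugabyte-db.git
cd yugabyte-db
# move aside any build directory left over from a local (non-container) build
[ -d build ] && mv build build.local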
From inside the source directory, start a build shell like this:
docker run --rm -ti -v "$PWD":/opt/yb-build/yugabyte-db yugabyteci/yb_build_infra_centos7:v2021-03-26T05_02_29
Inside the container, go to the mounted source directory:
cd /opt/yb-build/yugabyte-db
In the shell, set up the build by running:
./yb_build.sh release --download-thirdparty --ninja --cmake-only
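After this step the build root referenced below should exist:
ls -d build/release-gcc-dynamic-ninja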
Then you can use ninja to build individual submodules of YugabyteDB:
cd /opt/yb-build/yugabyte-db/build/release-gcc-dynamic-ninja
ninja yb_util
# or
ninja rocksdb
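ninja parallelizes by default; if the container runs low on memory you can cap the job count with the standard -j flag:
ninja -j4 yb_util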
The built artifacts can be found in the bin and lib directories. If you built only a submodule, you can hot-patch it on the server by copying over the .so file you just built. (As an additional sanity check, compare the sizes of the original and the new .so file.)
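A sketch of the hot-patch, assuming you built yb_util and the server keeps its libraries under /path/to/yugabyte/lib (both the library name and the server path are assumptions to adapt):
# sanity check: compare the sizes of the old and new library
ls -l build/release-gcc-dynamic-ninja/lib/libyb_util.so /path/to/yugabyte/lib/libyb_util.so
cp build/release-gcc-dynamic-ninja/lib/libyb_util.so /path/to/yugabyte/lib/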
The mapping between source directories and target modules is not always obvious. To get a list of available targets, run ninja -t targets. To check whether you picked the right module, you can add an error to a source file and see whether the target build fails (i.e. whether the target actually covers the source file in question).
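For example, to look for rocksdb-related targets and then verify the mapping with a deliberate compile error (the file path here is just an illustration):
ninja -t targets | grep -i rocksdb
echo '#error mapping-test' >> src/yb/rocksdb/db/db_impl.cc
ninja rocksdb   # should now fail if the target covers this file
git checkout -- src/yb/rocksdb/db/db_impl.cc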
You can use
./yb_build.sh release --download-thirdparty --ninja
to build the full release, but that takes a long time (~30 minutes for me).
To build a release tarball, make sure patchelf is installed in the container (yum install patchelf as root) and then run:
./yb_release --build release --skip_build --build_root build/release-clang11-dynamic-ninja
Match --build_root to the build directory yb_build.sh actually created; with the setup above that would be build/release-gcc-dynamic-ninja.
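If you are unsure whether patchelf is already present, a quick check inside the container (the yum step requires root):
command -v patchelf || yum install -y patchelf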