Vcpkg: A build-script management tool with a low entry barrier
Vcpkg has certainly gained a lot of traction in the C++ community. Next to conan it is one of the most frequently used dependency and build-script management tools. Nevertheless, when I initially investigated this topic for a larger enterprise application I worked on at the time, I jumped onto the conan bandwagon. I had the opportunity to briefly talk with Diego Gonzalez, one of its creators, at Meeting C++ 2019 in Berlin. This was also around the time when JFrog started to financially back conan. These facts, together with the technical aspects, were reason enough to choose conan over its alternatives. Since then a few years have gone by and I wanted to check on the current state of vcpkg. Today it has a vibrant community and has expanded its feature set significantly.
Introduction
The initial setup is quite straightforward. Cloning the official vcpkg repository and running the bootstrap command is all that's required to get started.
git clone https://github.com/microsoft/vcpkg.git
$VCPKG_REPO/bootstrap-vcpkg.sh -disableMetrics
A couple of details one might not immediately notice: we were actually instructed to clone the vcpkg repository, which contains the official collection of build scripts, the so-called ports. The actual vcpkg utility is called vcpkg-tool and is downloaded as a binary when running the bootstrap script. This is something I noticed quite a few times with vcpkg; it often downloads precompiled binaries from different web resources. It always checks the SHA512 hash of the file to ensure the files haven't been tampered with, but in an enterprise environment one might still not be in favor of depending on random external resources.
One of the things vcpkg does very well is reducing the effort required to get everything up and running. The target platform is specified using a triplet file, and triplets for the most common scenarios are already part of the vcpkg repository. The same is true for the ports of the most common open source libraries. As an example, compiling opencv4 only requires a single command:
$VCPKG_REPO/vcpkg install opencv4:x64-linux
Running this command will build the debug and release versions of opencv4 and install them in the $VCPKG_REPO/installed directory. Only the name of the package and the triplet name have to be specified. When building the libraries for the first time, vcpkg will download some additional build tools (e.g. nasm) from GitHub and msys2.org on Windows, whereas on Linux it asks the user to install them from the distribution's package manager. In both cases opencv4 and all its dependencies install without any issues. In some additional tests I was also able to install qt5 with the same simple command. The resulting binaries can be integrated into a cmake based project with ease. The only requirement is to point cmake to the vcpkg scripts by setting a toolchain file.
cd /path/to/your/cmake/project
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_TOOLCHAIN_FILE=$VCPKG_REPO/scripts/buildsystems/vcpkg.cmake ..
The cmake script of the consuming project can continue to use the regular find_package calls we're all familiar with, meaning no modifications are necessary.
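As a minimal sketch, a consuming project could look like the following; the find_package name and the OpenCV_LIBS variable come from OpenCV's own cmake config files rather than from vcpkg, so treat the details as illustrative.
# CMakeLists.txt of the consuming project
cmake_minimum_required(VERSION 3.20)
project(consumer CXX)

# resolved through the vcpkg toolchain file passed on the command line
find_package(OpenCV REQUIRED)

add_executable(consumer main.cpp)
target_link_libraries(consumer PRIVATE ${OpenCV_LIBS})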
Creating a Triplet File
While starting off is quite simple, in most projects the default platform settings might not be what's required. This isn't any different with the triplets within the vcpkg repository. The default triplet for Windows doesn't specify a toolset (Visual Studio) version, and the default triplet for Linux compiles static libraries instead of dynamic ones as on Windows. Fortunately it is relatively simple to create custom triplets. Creating a file with the triplet name and putting it in a new directory is all that's required.
# x64-linux-dynamic.cmake
set(VCPKG_TARGET_ARCHITECTURE x64)
set(VCPKG_CRT_LINKAGE dynamic)
set(VCPKG_LIBRARY_LINKAGE dynamic)
set(VCPKG_CMAKE_SYSTEM_NAME Linux)

# custom Windows triplet with an explicit toolset
set(VCPKG_TARGET_ARCHITECTURE x64)
set(VCPKG_CRT_LINKAGE dynamic)
set(VCPKG_LIBRARY_LINKAGE dynamic)
set(VCPKG_PLATFORM_TOOLSET v142)
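In my setup the overlay directory then simply contains one file per custom triplet, where the file name has to match the triplet name passed to vcpkg; the Windows file name below is only a placeholder.
$VCPKG_OVERLAYS/triplets/x64-linux-dynamic.cmake
$VCPKG_OVERLAYS/triplets/x64-windows-v142.cmake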
The path to this directory is then specified using the --overlay-triplets argument when calling vcpkg.
$VCPKG_REPO/vcpkg --overlay-triplets=$VCPKG_OVERLAYS/triplets install libxml2:x64-linux-dynamic
When consuming a package with a custom triplet one additionally has to pass -DVCPKG_TARGET_TRIPLET=x64-linux-dynamic to cmake in order to ensure the correct install directory is selected. The two triplet files above worked without issues in my tests when compiling libxml2 as a dynamic library. Unfortunately the same was not true when compiling qt5, which failed with dynamic linking issues within its dependencies. Some research then pointed me to a few entries in the GitHub issue tracker, such as Issue 117722 and Issue 9847. Dynamic compilation has been a problem with qt5 multiple times and in general doesn't seem to be well supported. This is very unfortunate since some licenses require certain projects to be compiled as dynamic libraries. Qt is a famous example: it is available under the LGPL and under a commercial license at extremely high cost. The latter is often not an option, and the former requires dynamic linking. It is even more surprising since the official vcpkg documentation explicitly lists qt5 as an example for dynamic compilation.
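For completeness, consuming the dynamically built libxml2 then simply combines the toolchain file from earlier with the triplet selection in one cmake invocation:
cmake -DCMAKE_BUILD_TYPE=Release \
      -DCMAKE_TOOLCHAIN_FILE=$VCPKG_REPO/scripts/buildsystems/vcpkg.cmake \
      -DVCPKG_TARGET_TRIPLET=x64-linux-dynamic ..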
Creating a Port
While one likely gets far with the default ports, there is often one or another obscure library which isn't available yet. As an example I was looking into packaging my own libazul. First we have to create a ports directory which is then passed to vcpkg using the --overlay-ports argument. A subdirectory named after the library then contains a metadata file vcpkg.json and a build script portfile.cmake.
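For orientation, this is roughly what the finished overlay looks like in my case; the two patched cmake files are added in a later step and the $VCPKG_OVERLAYS path is just an example.
$VCPKG_OVERLAYS/ports/libazul/vcpkg.json
$VCPKG_OVERLAYS/ports/libazul/portfile.cmake
$VCPKG_OVERLAYS/ports/libazul/CMakeLists.txt
$VCPKG_OVERLAYS/ports/libazul/azulConfig.cmake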
# vcpkg.json
{
    "name": "libazul",
    "version-string": "0.0.1",
    "description": "a c++ framework",
    "dependencies": [
        {
            "name": "vcpkg-cmake",
            "host": true
        },
        {
            "name": "vcpkg-cmake-config",
            "host": true
        }
    ]
}
The metadata file contains the name, description, version and package dependencies. In our example we add two dependencies which we require for additional cmake functionality within the build script.
# portfile.cmake
vcpkg_from_github(
    OUT_SOURCE_PATH SOURCE_PATH
    REPO MichaEiler/libazul
    REF 4a18c5d9d71722b09e04ab5d837d5f012ae4e8b0
    SHA512 f4a7a564a61ceabbf91de10dec0980b65183e89d0972799c0b796a09320da9c0a08cd304b26a6c32243cdb50243a8cdba4b9e63f9f78d71b8bd47450b940cdee
    HEAD_REF master
)

file(COPY "${CMAKE_CURRENT_LIST_DIR}/CMakeLists.txt" DESTINATION "${SOURCE_PATH}")
file(COPY "${CMAKE_CURRENT_LIST_DIR}/azulConfig.cmake" DESTINATION "${SOURCE_PATH}")

vcpkg_cmake_configure(
    SOURCE_PATH ${SOURCE_PATH}
    OPTIONS
        -DLIBAZUL_WITH_TESTS=OFF
        -DBUILD_SHARED_LIBS=OFF
)

vcpkg_cmake_install()
vcpkg_cmake_config_fixup(PACKAGE_NAME "azul" CONFIG_PATH "lib/cmake/azul")
vcpkg_fixup_pkgconfig()

file(INSTALL ${SOURCE_PATH}/LICENSE DESTINATION ${CURRENT_PACKAGES_DIR}/share/libazul RENAME copyright)
file(REMOVE_RECURSE "${CURRENT_PACKAGES_DIR}/debug/include")
The build script starts by fetching the source code from GitHub and then copies over the patched cmake files I included below. The configure step then prepares the build process and specifies all the build parameters. In contrast to the options concept in conan, compilation parameters are therefore hard-coded in the case of vcpkg ports. The subsequent vcpkg_cmake_install function compiles the library and installs it in the target directory. The config fixup function applies some fixes to the cmake target files which are required for the find_package feature to work when consuming the library. Again, in contrast to conan this is not something vcpkg generates; we have to ensure a working cmake config file is created during package compilation. I added the last two lines in the patched CMakeLists.txt below to achieve this. The azulConfig.cmake then just includes the resulting targets file.
# CMakeLists.txt
cmake_minimum_required(VERSION 3.20)
project(azul C CXX)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_VISIBILITY_PRESET hidden)
set(CMAKE_VISIBILITY_INLINES_HIDDEN ON)
set(CMAKE_POSITION_INDEPENDENT_CODE ON)
option(LIBAZUL_WITH_IPC "Enable the build of the IPC component. (Not available on iOS and Android)" ON)
option(LIBAZUL_WITH_TESTS "Enable the compilation of all unit test projects. (Not available on iOS and Android)" ON)
include(${CMAKE_SOURCE_DIR}/cmake/common.cmake)
include(${CMAKE_SOURCE_DIR}/cmake/platform.cmake)
set_global_compiler_options()
add_subdirectory(${CMAKE_SOURCE_DIR}/src/utils)
add_subdirectory(${CMAKE_SOURCE_DIR}/src/async)
add_subdirectory(${CMAKE_SOURCE_DIR}/src/compute)
if (LIBAZUL_WITH_IPC)
    add_subdirectory(${CMAKE_SOURCE_DIR}/src/ipc/)
endif()

if (LIBAZUL_WITH_TESTS AND NOT BUILD_SHARED_LIBS)
    add_subdirectory(${CMAKE_SOURCE_DIR}/3rdparty/googletest)
    add_subdirectory(${CMAKE_SOURCE_DIR}/tests/async/)
    add_subdirectory(${CMAKE_SOURCE_DIR}/tests/compute/)
    add_subdirectory(${CMAKE_SOURCE_DIR}/tests/utils)
    if (LIBAZUL_WITH_IPC)
        add_subdirectory(${CMAKE_SOURCE_DIR}/tests/ipc/)
    endif()
endif()
export(TARGETS azul_async azul_compute azul_ipc azul_utils NAMESPACE azul:: FILE ${CMAKE_INSTALL_PREFIX}/lib/cmake/azul/azulTargets.cmake)
install(FILES ${CMAKE_SOURCE_DIR}/azulConfig.cmake DESTINATION ${CMAKE_INSTALL_PREFIX}/lib/cmake/azul/)
# azulConfig.cmake
if(NOT TARGET azul)
    include("${CMAKE_CURRENT_LIST_DIR}/azulTargets.cmake")
endif()
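Once everything is in place the port can be built through the overlay and consumed like any other package. A minimal sketch, assuming the target names produced by the export() call above:
$VCPKG_REPO/vcpkg --overlay-ports=$VCPKG_OVERLAYS/ports install libazul:x64-linux

# in the consuming CMakeLists.txt
find_package(azul CONFIG REQUIRED)
target_link_libraries(consumer PRIVATE azul::azul_utils azul::azul_async)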
Overall this seems to work just fine, but it doesn't help that the vcpkg documentation isn't very extensive. The easiest way to get to a result is to look up how other ports have solved the build and install process. A major disappointment has also been the hard-coded defines, meaning one has to create a custom port as soon as a single compilation argument has to be changed.
Impressions & Initial Conclusion
One of vcpkg's biggest appeals is its low entry barrier. Starting off with a new project is incredibly simple. Once one gets deeper into the details, vcpkg, like most other tools, shows its limitations too. Compared to conan it seems to be less flexible, has a smaller feature set and worse documentation. Paired with the dynamic linking issues and limited multi-platform support, conan still looks to me like the better option for larger projects. I also prefer the ability to write build scripts in python rather than cmake.