Dawn bindings for NodeJS

Note: This code is currently a work in progress. There are a number of known issues.

Building

System requirements

Install depot_tools

Dawn uses the Chromium build system and dependency management, so you need to install depot_tools and add it to the PATH.
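
For example, on Linux or macOS a typical setup looks something like the following (the install location ~/depot_tools is just an illustration):

# Clone depot_tools and put it on the PATH
git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git ~/depot_tools
export PATH="$HOME/depot_tools:$PATH"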

Fetch dependencies

The steps are similar to those in docs/building.md, but instead of the "Get the code" step, run:

# Clone the repo as "dawn"
git clone https://dawn.googlesource.com/dawn dawn && cd dawn

# Bootstrap the NodeJS binding gclient configuration
cp scripts/standalone-with-node.gclient .gclient

# Fetch external dependencies and toolchains with gclient
gclient sync

Optionally, on Linux install X11-xcb support:

sudo apt-get install libx11-xcb-dev

If you don't have these supporting libraries, you must pass the -DDAWN_USE_X11=OFF flag to CMake (see below).
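
For example, the CMake generation step from the Build section below would then look something like:

cmake <dawn-root-path> -GNinja -DDAWN_BUILD_NODE_BINDINGS=1 -DDAWN_USE_X11=OFF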

Build

Currently, the node bindings can only be built with CMake:

mkdir <build-output-path>
cd <build-output-path>
cmake <dawn-root-path> -GNinja -DDAWN_BUILD_NODE_BINDINGS=1
ninja dawn.node

On Windows, the steps are similar:

mkdir <build-output-path>
cd <build-output-path>
cmake <dawn-root-path> -DDAWN_BUILD_NODE_BINDINGS=1
cmake --build . --target dawn_node

Running WebGPU CTS

  1. Build the dawn.node NodeJS module.
  2. Check out the WebGPU CTS repo, or use the one in third_party/webgpu-cts.
  3. Run npm install from inside the CTS directory to install its dependencies.
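
For example, if you use the in-tree copy of the CTS:

cd third_party/webgpu-cts
npm install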

Now you can run the CTS using our ./tools/run shell script. On Windows, it's recommended to use MSYS2 (e.g. Git Bash):

./tools/run run-cts --bin=<path-build-dir> [WebGPU CTS query]

Where <path-build-dir> is the output directory.
Note: <path-build-dir> can be omitted if your build directory sits at <dawn>/out/active, which is enforced if you use <dawn>/tools/setup-build (recommended).

Or if you checked out your own CTS repo:

./tools/run run-cts --bin=<path-build-dir> --cts=<path-to-cts> [WebGPU CTS query]

If this fails with the error message TypeError: expander is not a function or its return value is not iterable, try adding --build=false to the start of the run-cts command line flags.

To test against SwiftShader (a software implementation of Vulkan) instead of the default Vulkan device, prefix ./tools/run run-cts with VK_ICD_FILENAMES=<swiftshader-cmake-build>/Linux/vk_swiftshader_icd.json. For example:

VK_ICD_FILENAMES=<swiftshader-cmake-build>/Linux/vk_swiftshader_icd.json ./tools/run run-cts --bin=<path-build-dir> [WebGPU CTS query]

To test against Lavapipe (Mesa's software implementation of Vulkan), similarly prefix ./tools/run run-cts with VK_ICD_FILENAMES=<lavapipe-install-dir>/share/vulkan/icd.d/lvp_icd.x86_64.json. For example:

VK_ICD_FILENAMES=<lavapipe-install-dir>/share/vulkan/icd.d/lvp_icd.x86_64.json ./tools/run run-cts --bin=<path-build-dir> [WebGPU CTS query]

The --flag parameter must be passed multiple times, once for each flag being set. Here are some common flags:

  • backend=<null|webgpu|d3d11|d3d12|metal|vulkan|opengl|opengles>
  • adapter=<name-of-adapter> - specifies the adapter to use. May be a substring of the full adapter name. Pass an invalid adapter name and --verbose to see all possible adapters.
  • dlldir=<path> - used to add an extra DLL search path on Windows, primarily to load the right d3dcompiler_47.dll
  • enable-dawn-features=<features> - enable Dawn toggles, e.g. dump_shaders
  • disable-dawn-features=<features> - disable Dawn toggles

For example, on Windows, to use the d3dcompiler_47.dll from a Chromium checkout, and to dump shader output, we could run the following using Git Bash:

./tools/run run-cts --verbose --bin=/c/src/dawn/out/active --cts=/c/src/webgpu-cts --flag=dlldir="C:\src\chromium\src\out\Release" --flag=enable-dawn-features=dump_shaders 'webgpu:shader,execution,builtin,abs:integer_builtin_functions,abs_unsigned:storageClass="storage";storageMode="read_write";containerType="vector";isAtomic=false;baseType="u32";type="vec2%3Cu32%3E"'

Note that we pass --verbose above so that all test output, including the dumped shader, is written to stdout.

Testing against a run-cts expectations file

You can write out an expectations file with the --output <path> command line flag, and then compare this snapshot to a later run with --expect <path>.
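
A minimal sketch of that workflow (the file name expectations.txt is just an illustration):

# Record a baseline of the current results
./tools/run run-cts --bin=<path-build-dir> --output expectations.txt [WebGPU CTS query]

# Later, compare a new run against that baseline
./tools/run run-cts --bin=<path-build-dir> --expect expectations.txt [WebGPU CTS query]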

Viewing Dawn per-test coverage

Requirements:

Dawn needs to be built with Clang and with the DAWN_EMIT_COVERAGE CMake flag set.

LLVM is also required, either built from source or downloaded as part of an LLVM release. Make sure that its llvm/bin subdirectory is in your PATH, and that the llvm-cov and llvm-profdata binaries are present there.
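
A quick way to check this from a POSIX shell:

which llvm-cov llvm-profdata
llvm-cov --version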

Optionally, the LLVM_SOURCE_DIR CMake flag can also be specified to point to the ./llvm directory of an LLVM checkout, which will build turbo-cov and dramatically speed up the processing of coverage data. If turbo-cov is not built, llvm-cov will be used instead.
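
For example (the LLVM checkout path is illustrative):

cmake <dawn-root-path> -GNinja -DDAWN_BUILD_NODE_BINDINGS=1 -DDAWN_EMIT_COVERAGE=1 -DLLVM_SOURCE_DIR=<llvm-checkout>/llvm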

It may be helpful to write a bash script like use.sh that sets up your build environment, for example:

#!/bin/bash
LLVM_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}")" >/dev/null 2>&1 && pwd )"
export CC=${LLVM_DIR}/bin/clang
export CXX=${LLVM_DIR}/bin/clang++
export MSAN_SYMBOLIZER_PATH=${LLVM_DIR}/bin/llvm-symbolizer
export PATH=${LLVM_DIR}/bin:${PATH}

Place this script in the LLVM root directory, then you can set up your build for coverage as follows:

. ~/bin/llvm-15/use.sh
cmake <dawn-root-path> -GNinja -DDAWN_BUILD_NODE_BINDINGS=1 -DDAWN_EMIT_COVERAGE=1
ninja dawn.node

Note that if you already generated the CMake build environment with a different compiler, you will need to delete CMakeCache.txt and generate again.
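
For example, assuming the Ninja-based coverage build above:

cd <build-output-path>
rm CMakeCache.txt
cmake <dawn-root-path> -GNinja -DDAWN_BUILD_NODE_BINDINGS=1 -DDAWN_EMIT_COVERAGE=1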

Usage

Run ./tools/run run-cts like before, but include the --coverage flag. After running the tests, your browser will open with a coverage viewer.
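
For example:

./tools/run run-cts --coverage --bin=<path-build-dir> [WebGPU CTS query]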

Click a source file in the left hand panel, then click a green span in the file source to see the tests that exercised that code.

You can also highlight multiple lines to view all the tests that covered any of that highlighted source.

NOTE: If the left-hand panel is empty, ensure that dawn.node was built with a version of Clang that matches the version of LLVM used to retrieve the coverage information.

Debugging TypeScript with VSCode

Open or create the .vscode/launch.json file, and add:

{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug with node",
      "type": "node",
      "request": "launch",
      "outFiles": ["./**/*.js"],
      "args": [
        "-e",
        "require('./src/common/tools/setup-ts-in-node.js');require('./src/common/runtime/cmdline.ts');",
        "--",
        "placeholder-arg",
        "--gpu-provider",
        "[path-to-cts.js]", // REPLACE: [path-to-cts.js]
        "[test-query]" // REPLACE: [test-query]
      ],
      "cwd": "[cts-root]" // REPLACE: [cts-root]
    }
  ]
}

Replacing:

  • [cts-root] with the path to the CTS root directory. If you are editing .vscode/launch.json from within the CTS workspace, then you may use ${workspaceFolder}.
  • [path-to-cts.js] with the path to the cts.js file that is copied to the output directory by the build step.
  • [test-query] with the test query string. Example: webgpu:shader,execution,builtin,abs:*

Debugging dawn-node issues in gdb/lldb

It is possible to run the CTS with dawn-node directly, in a similar way to Debugging TypeScript with VSCode:

cd <cts-root-dir>
# [path-to-node] is, for example, <dawn-root-dir>/third_party/node/<arch>/node
[path-to-node] \
    -e "require('./src/common/tools/setup-ts-in-node.js');require('./src/common/runtime/cmdline.ts');" \
    -- \
    placeholder-arg \
    --gpu-provider [path-to-cts.js] \
    [test-query]

You can then run this command under your debugger of choice.
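
For example, a minimal sketch with lldb (gdb users can substitute gdb --args); the paths and query are placeholders:

cd <cts-root-dir>
lldb -- [path-to-node] \
    -e "require('./src/common/tools/setup-ts-in-node.js');require('./src/common/runtime/cmdline.ts');" \
    -- \
    placeholder-arg \
    --gpu-provider [path-to-cts.js] \
    [test-query]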

Recipes for building software GPUs

Building Lavapipe (LLVM Vulkan)

System requirements

  • Python 3.6 or newer
  • Meson
  • llvm-dev

These can be pre-built versions from apt-get, etc.
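
For example, on a Debian/Ubuntu-based system (package names may differ on other distributions):

sudo apt-get install python3 meson llvm-dev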

Get source code

You can either download a specific version of Mesa from the Mesa downloads page, or use git to clone the source tree:

git clone https://gitlab.freedesktop.org/mesa/mesa.git

Building

In the source directory:

mkdir <build-dir>
meson setup <build-dir>/ -Dprefix=<lavapipe-install-dir> -Dvulkan-drivers=swrast
meson compile -C <build-dir>
meson install -C <build-dir>

This should result in Lavapipe being built and the artifacts copied to <lavapipe-install-dir>.

Further details can be found in the Mesa documentation.

Known issues

See https://bugs.chromium.org/p/dawn/issues/list?q=component%3ADawnNode&can=2 for tracked bugs, and TODOs in the code.

Remaining work