Setup

Nix

Psyche can use Nix + flakes as a build system to make your life easier. To install Nix, run:

```bash
curl --proto '=https' --tlsv1.2 -sSf -L https://install.determinate.systems/nix | sh -s -- install
```

or install it through your system package manager.
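
Afterwards, you can confirm Nix is available from a fresh shell. Note that flakes need to be enabled; the Determinate Systems installer does this by default, but a distro-packaged Nix may not:

```bash
# Quick check that Nix is installed
nix --version

# If flakes aren't enabled, add this line to ~/.config/nix/nix.conf:
# experimental-features = nix-command flakes
```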

macOS / aarch64

These platforms aren't supported right now :( PRs welcome!

Direnv

  1. Install direnv
  2. Run direnv allow in the repository root
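
If direnv isn't already hooked into your shell, add the hook to your shell config (bash shown here; see the direnv docs for other shells), then re-open the shell:

```bash
eval "$(direnv hook bash)"
```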

Non-direnv

Run nix develop to enter a development shell.
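
You can also run one-off commands inside the dev shell without entering it interactively, for example (cargo build is just an illustrative command):

```bash
nix develop --command cargo build
```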

Ubuntu

The following instructions are for a server with a fresh Ubuntu installation. A sanity-check block at the end of this section can help confirm that each tool is in place.

  1. Install drivers

```bash
sudo apt update
sudo apt install -y ubuntu-drivers-common
sudo ubuntu-drivers install
```
  2. Install CUDA libraries

```bash
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb
sudo dpkg -i cuda-keyring_1.1-1_all.deb
sudo apt-get update
sudo apt-get -y install cuda-toolkit-12-4
rm cuda-keyring_1.1-1_all.deb
sudo apt-get install libnccl-dev libnccl2
sudo apt install nvidia-cuda-toolkit
```
  3. Download libtorch & extract

```bash
wget https://download.pytorch.org/libtorch/cu124/libtorch-cxx11-abi-shared-with-deps-2.6.0%2Bcu124.zip
unzip libtorch-cxx11-abi-shared-with-deps-2.6.0+cu124.zip
rm libtorch-cxx11-abi-shared-with-deps-2.6.0+cu124.zip
```
  4. In your .bashrc, set the following libtorch environment variables, where <path_to_libtorch> is the absolute path to the libtorch folder extracted in the previous step:

```bash
export LIBTORCH=<path_to_libtorch>
export LIBTORCH_INCLUDE=<path_to_libtorch>
export LIBTORCH_LIB=<path_to_libtorch>
export LD_LIBRARY_PATH=<path_to_libtorch>/lib:$LD_LIBRARY_PATH
export CUDA_ROOT=/usr/local/cuda-12.4
```

This can also be achieved by creating a .cargo/config.toml file in the checkout path:

```toml
[env]
LIBTORCH = "<path_to_libtorch>"
LD_LIBRARY_PATH = "<path_to_libtorch>/lib"
CUDA_ROOT = "/usr/local/cuda-12.4"
```
  5. Download & install Rust

```bash
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```
  6. (optional) Install just

```bash
sudo snap install just --edge --classic
```
  7. (optional) Install Solana and Anchor

Install Solana

```bash
sh -c "$(curl -sSfL https://release.anza.xyz/beta/install)"
```

After installation, follow the instructions printed by the installer to add the Solana tools to your PATH.
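
The installer prints the exact line to add; on Linux it is typically something like the following (verify against the installer output), after which you can check the install:

```bash
export PATH="$HOME/.local/share/solana/install/active_release/bin:$PATH"
solana --version
```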

Install Anchor

```bash
cargo install --git https://github.com/coral-xyz/anchor --rev a7a23eea308440a9fa9cb79cee7bddd30ab163d5 anchor-cli
```

This may require some additional system packages:

```bash
sudo apt install pkg-config libudev-dev libssl-dev
```
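
With the steps above done, a quick sanity check of the toolchain might look like this; these are generic version/driver queries rather than Psyche-specific commands, so skip any tools you didn't install:

```bash
nvidia-smi        # GPU driver is loaded (a reboot may be needed after installing drivers)
nvcc --version    # CUDA toolkit
rustc --version   # Rust
just --version    # just (optional)
solana --version  # Solana CLI (optional)
anchor --version  # Anchor CLI (optional)
```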

Windows

  1. Install CUDA libraries: https://developer.nvidia.com/cuda-12-4-1-download-archive?target_os=Windows&target_arch=x86_64&target_version=11

  2. Download libtorch & extract: https://download.pytorch.org/libtorch/cu124/libtorch-cxx11-abi-shared-with-deps-2.6.0%2Bcu124.zip

  3. Download and install OpenSSL: https://slproweb.com/download/Win64OpenSSL-3_3_2.exe

  4. Install Perl: https://github.com/StrawberryPerl/Perl-Dist-Strawberry/releases/download/SP_53822_64bit/strawberry-perl-5.38.2.2-64bit.msi

  5. Create a .cargo/config.toml file to set environment variables

NOTE: The first build may take several minutes, as openssl-sys takes a long time to compile.

```toml
[env]
LIBTORCH = "<path_to_libtorch>"
OPENSSL_LIB_DIR = "<path_to_openssl>/lib/VC/x64/MT"
OPENSSL_INCLUDE_DIR = "<path_to_openssl>/include"
```
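
As a concrete, purely hypothetical example, if libtorch were extracted to C:\libtorch and OpenSSL installed to C:\Program Files\OpenSSL-Win64, using TOML literal strings (single quotes) avoids having to escape the backslashes:

```toml
[env]
LIBTORCH = 'C:\libtorch'
OPENSSL_LIB_DIR = 'C:\Program Files\OpenSSL-Win64\lib\VC\x64\MT'
OPENSSL_INCLUDE_DIR = 'C:\Program Files\OpenSSL-Win64\include'
```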

Docker

Create a Docker image with the necessary dependencies to run a Psyche client:

  1. Install the necessary NVIDIA and CUDA drivers as explained in the previous sections.
  2. Install the NVIDIA container toolkit (see the note after this list about the Docker runtime). If using Ubuntu, just run:

```bash
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit
```
  3. Create a .env file following the .env.example in psyche/config/client and update the necessary environment variables.
  4. Run docker compose build.
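
Depending on how Docker is set up, you may also need to register the NVIDIA runtime with Docker, and it's worth confirming that containers can see the GPU before running a client. A sketch using the standard NVIDIA tooling (the CUDA image tag is only an example):

```bash
# Register the NVIDIA runtime with Docker and restart the daemon
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Check that containers can access the GPU
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```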

Useful commands

Psyche uses just to run some common tasks!

You can run just with no arguments to see the whole list of available commands!

Running checks

Requires Nix!

```bash
just check
```

If it passes, CI will pass.

Formatting

```bash
just fmt
```