2025-02-06 10:07.45: New job: test ocaml-multicore/picos https://github.com/ocaml-multicore/picos.git#refs/heads/add-queue (269016ed9530e0bb707ec5c14deb905f0db1d4f9) (openbsd-amd64:openbsd-76-amd64-5.3_opam-2.3)

Base: openbsd-76-amd64-ocaml-5.3
Opam project build

To reproduce locally:

git clone --recursive "https://github.com/ocaml-multicore/picos.git" -b "add-queue" && cd "picos" && git reset --hard 269016ed
cat > Dockerfile <<'END-OF-DOCKERFILE'
FROM openbsd-76-amd64-ocaml-5.3
# openbsd-76-amd64-5.3_opam-2.3
USER 1000:1000
ENV CLICOLOR_FORCE="1"
ENV OPAMCOLOR="always"
RUN doas ln -f /usr/bin/opam-2.3 /usr/bin/opam
RUN opam init --reinit -ni
RUN uname -rs && opam exec -- ocaml -version && opam --version
RUN cd ~/opam-repository && (git cat-file -e a3f03aaceb14fa3a2f69a8d4f7c3cb97d896ce34 || git fetch origin master) && git reset -q --hard a3f03aaceb14fa3a2f69a8d4f7c3cb97d896ce34 && git log --no-decorate -n1 --oneline && opam update -u
COPY --chown=1000:1000 picos_std.opam picos_mux.opam picos_meta.opam picos_lwt.opam picos_io_cohttp.opam picos_io.opam picos_aux.opam picos.opam /home/opam/src/./
RUN opam pin add -yn picos_std.dev '/home/opam/src/./' && \
    opam pin add -yn picos_mux.dev '/home/opam/src/./' && \
    opam pin add -yn picos_meta.dev '/home/opam/src/./' && \
    opam pin add -yn picos_lwt.dev '/home/opam/src/./' && \
    opam pin add -yn picos_io_cohttp.dev '/home/opam/src/./' && \
    opam pin add -yn picos_io.dev '/home/opam/src/./' && \
    opam pin add -yn picos_aux.dev '/home/opam/src/./' && \
    opam pin add -yn picos.dev '/home/opam/src/./'
RUN echo '(lang dune 3.0)' > '/home/opam/src/./dune-project'
ENV DEPS="alcotest.1.8.0 angstrom.0.16.1 asn1-combinators.0.3.2 astring.0.8.5 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-bytes.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base base64.3.5.1 bigstringaf.0.10.0 bos.0.2.1 ca-certs.1.0.0 camlp-streams.5.0.1 cmdliner.1.3.0 cohttp.6.0.0 cohttp-lwt.6.0.0 cohttp-lwt-unix.6.0.0 conduit.7.1.0 conduit-lwt.7.1.0 conduit-lwt-unix.7.1.0 conf-gmp.4 conf-gmp-powm-sec.3 conf-npm.1 conf-pkg-config.4 containers.3.15 cppo.1.8.0 csexp.1.5.2 digestif.1.2.0 domain-local-await.1.0.1 domain-name.0.4.0 domain_shims.0.1.0 dscheck.0.5.0 dune.3.17.2 dune-configurator.3.17.2 duration.0.2.1 either.1.0.0 eqaf.0.10 fmt.0.9.0 fpath.0.7.3 gen.1.1 gmap.0.3.0 http.6.0.0 ipaddr.5.6.0 ipaddr-sexp.5.6.0 js_of_ocaml.5.9.1 js_of_ocaml-compiler.5.9.1 kdf.1.0.0 logs.0.7.0 lwt.5.9.0 macaddr.5.6.0 magic-mime.1.3.1 mdx.2.5.0 menhir.20240715 menhirCST.20240715 menhirLib.20240715 menhirSdk.20240715 mirage-crypto.2.0.0 mirage-crypto-ec.2.0.0 mirage-crypto-pk.2.0.0 mirage-crypto-rng.2.0.0 mtime.2.1.0 multicore-bench.0.1.7 multicore-magic.2.3.1 multicore-magic-dscheck.2.3.1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.3.7.3 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.15.0 ocamlfind.1.9.8 ocplib-endian.1.2 ohex.0.2.0 oseq.0.5.1 ppx_derivers.1.2.1 ppx_sexp_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 psq.0.2.1 ptime.1.2.0 qcheck-core.0.23 qcheck-multicoretests-util.0.7 qcheck-stm.0.7 re.1.12.0 result.1.5 rresult.0.7.0 sedlex.3.3 seq.base sexplib0.v0.17.0 stdlib-shims.0.3.0 stringext.1.6.0 thread-local-storage.0.2 thread-table.1.0.0 topkg.1.0.7 tsort.2.1.0 uri.4.4.0 uri-sexp.4.4.0 uutf.1.0.3 x509.1.0.5 yojson.2.2.2 zarith.1.14"
ENV CI="true"
ENV OCAMLCI="true"
RUN opam update --depexts && opam install --cli=2.3 --depext-only -y picos_std.dev picos_mux.dev picos_meta.dev picos_lwt.dev picos_io_cohttp.dev picos_io.dev picos_aux.dev picos.dev $DEPS
RUN opam install $DEPS
COPY --chown=1000:1000 . /home/opam/src
RUN cd /home/opam/src && opam exec -- dune build @install @check @runtest && rm -rf _build
END-OF-DOCKERFILE

docker build .
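Aside: every entry in the DEPS variable above is an opam package "atom" of the form name.version (e.g. dune.3.17.2, or seq.base for a virtual base package). Since opam package names never contain a dot, the pair can be recovered by splitting on the first dot. A minimal sketch (Python is not part of this build; this is only an illustration of the atom format):

```python
# Parse opam "name.version" atoms, as used in the ENV DEPS line above.
# Splitting on the FIRST dot works because opam package names contain
# no dots, while versions may (e.g. "5.3.0", "v0.17.1").
def parse_atom(atom: str) -> tuple[str, str]:
    name, version = atom.split(".", 1)
    return name, version

# A few atoms taken verbatim from the DEPS list.
deps = "alcotest.1.8.0 dune.3.17.2 seq.base ocaml-base-compiler.5.3.0"
pins = dict(parse_atom(a) for a in deps.split())

print(pins["dune"])                 # -> 3.17.2
print(pins["seq"])                  # -> base
print(pins["ocaml-base-compiler"])  # -> 5.3.0
```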
END-REPRO-BLOCK

2025-02-06 10:07.45: Using cache hint "ocaml-multicore/picos-openbsd-76-amd64-ocaml-5.3-openbsd-76-amd64-5.3_opam-2.3-c28f039ead9a0d71536e13b6ebdf14d4"
2025-02-06 10:07.45: Using OBuilder spec:
((from openbsd-76-amd64-ocaml-5.3)
 (comment openbsd-76-amd64-5.3_opam-2.3)
 (user (uid 1000) (gid 1000))
 (env CLICOLOR_FORCE 1)
 (env OPAMCOLOR always)
 (run (shell "doas ln -f /usr/bin/opam-2.3 /usr/bin/opam"))
 (run (shell "opam init --reinit -ni"))
 (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
 (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e a3f03aaceb14fa3a2f69a8d4f7c3cb97d896ce34 || git fetch origin master) && git reset -q --hard a3f03aaceb14fa3a2f69a8d4f7c3cb97d896ce34 && git log --no-decorate -n1 --oneline && opam update -u"))
 (copy (src picos_std.opam picos_mux.opam picos_meta.opam picos_lwt.opam picos_io_cohttp.opam picos_io.opam picos_aux.opam picos.opam) (dst /home/opam/src/./))
 (run (network host) (shell "opam pin add -yn picos_std.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_mux.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_meta.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_lwt.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_io_cohttp.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_io.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_aux.dev '/home/opam/src/./' && \ \nopam pin add -yn picos.dev '/home/opam/src/./'"))
 (run (network host) (shell "echo '(lang dune 3.0)' > '/home/opam/src/./dune-project'"))
 (env DEPS "alcotest.1.8.0 angstrom.0.16.1 asn1-combinators.0.3.2 astring.0.8.5 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-bytes.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base base64.3.5.1 bigstringaf.0.10.0 bos.0.2.1 ca-certs.1.0.0 camlp-streams.5.0.1 cmdliner.1.3.0 cohttp.6.0.0 cohttp-lwt.6.0.0 cohttp-lwt-unix.6.0.0 conduit.7.1.0 conduit-lwt.7.1.0 conduit-lwt-unix.7.1.0 conf-gmp.4 conf-gmp-powm-sec.3 conf-npm.1 conf-pkg-config.4 containers.3.15 cppo.1.8.0 csexp.1.5.2 digestif.1.2.0 domain-local-await.1.0.1 domain-name.0.4.0 domain_shims.0.1.0 dscheck.0.5.0 dune.3.17.2 dune-configurator.3.17.2 duration.0.2.1 either.1.0.0 eqaf.0.10 fmt.0.9.0 fpath.0.7.3 gen.1.1 gmap.0.3.0 http.6.0.0 ipaddr.5.6.0 ipaddr-sexp.5.6.0 js_of_ocaml.5.9.1 js_of_ocaml-compiler.5.9.1 kdf.1.0.0 logs.0.7.0 lwt.5.9.0 macaddr.5.6.0 magic-mime.1.3.1 mdx.2.5.0 menhir.20240715 menhirCST.20240715 menhirLib.20240715 menhirSdk.20240715 mirage-crypto.2.0.0 mirage-crypto-ec.2.0.0 mirage-crypto-pk.2.0.0 mirage-crypto-rng.2.0.0 mtime.2.1.0 multicore-bench.0.1.7 multicore-magic.2.3.1 multicore-magic-dscheck.2.3.1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.3.7.3 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.15.0 ocamlfind.1.9.8 ocplib-endian.1.2 ohex.0.2.0 oseq.0.5.1 ppx_derivers.1.2.1 ppx_sexp_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 psq.0.2.1 ptime.1.2.0 qcheck-core.0.23 qcheck-multicoretests-util.0.7 qcheck-stm.0.7 re.1.12.0 result.1.5 rresult.0.7.0 sedlex.3.3 seq.base sexplib0.v0.17.0 stdlib-shims.0.3.0 stringext.1.6.0 thread-local-storage.0.2 thread-table.1.0.0 topkg.1.0.7 tsort.2.1.0 uri.4.4.0 uri-sexp.4.4.0 uutf.1.0.3 x509.1.0.5 yojson.2.2.2 zarith.1.14")
 (env CI true)
 (env OCAMLCI true)
 (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y picos_std.dev picos_mux.dev picos_meta.dev picos_lwt.dev picos_io_cohttp.dev picos_io.dev picos_aux.dev picos.dev $DEPS"))
 (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS"))
 (copy (src .) (dst /home/opam/src))
 (run (shell "cd /home/opam/src && opam exec -- dune build @install @check @runtest && rm -rf _build"))
)

2025-02-06 10:07.45: Waiting for resource in pool OCluster
2025-02-06 10:07.45: Waiting for worker…
2025-02-06 13:17.50: Got resource from pool OCluster
Building on bremusa
All commits already cached
HEAD is now at 269016e Add `Picos_std_sync.Queue`

(from openbsd-76-amd64-ocaml-5.3)
2025-02-06 13:17.50 ---> using "753f998be4709b35e38a93b7272dee818f697255d5b48fadd1c23cb92a82f244" from cache

/: (comment openbsd-76-amd64-5.3_opam-2.3)
/: (user (uid 1000) (gid 1000))
/: (env CLICOLOR_FORCE 1)
/: (env OPAMCOLOR always)
/: (run (shell "doas ln -f /usr/bin/opam-2.3 /usr/bin/opam"))
2025-02-06 13:17.50 ---> using "b8621e22ceef6e45a2b70edd0350b6ec56c4cb2588ea75f8c24bf97f9bb8e3c1" from cache

/: (run (shell "opam init --reinit -ni"))
No configuration file found, using built-in defaults.
Checking for available remotes: rsync and local, git.
  - you won't be able to use mercurial repositories unless you install the hg command on your system.
  - you won't be able to use darcs repositories unless you install the darcs command on your system.
<><> Updating repositories ><><><><><><><><><><><><><><><><><><><><><><><><><><>
[default] synchronised from file:///home/opam/opam-repository
2025-02-06 13:17.50 ---> using "caa4033b0f51d691107e536d025405bc96e59134e192afef776a8bfd43aac5e1" from cache

/: (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
OpenBSD 7.6
The OCaml toplevel, version 5.3.0
2.3.0
2025-02-06 13:17.50 ---> using "7116693f695a70fbcc11fcbf90bc7fd32f30a6c20aabff3b1e1c926faaad9b01" from cache

/: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e a3f03aaceb14fa3a2f69a8d4f7c3cb97d896ce34 || git fetch origin master) && git reset -q --hard a3f03aaceb14fa3a2f69a8d4f7c3cb97d896ce34 && git log --no-decorate -n1 --oneline && opam update -u"))
From https://github.com/ocaml/opam-repository
 * branch                  master     -> FETCH_HEAD
   f93eb7cb02..eab2328a9d  master     -> origin/master
a3f03aaceb Merge pull request #27384 from hannesm/release-mirage-crypto-v2.0.0

<><> Updating package repositories ><><><><><><><><><><><><><><><><><><><><><><>
[default] synchronised from file:///home/opam/opam-repository
Everything as up-to-date as possible (run with --verbose to show unavailable upgrades).
However, you may "opam upgrade" these packages explicitly, which will ask permission to downgrade or uninstall the conflicting packages.
Nothing to do.
# To update the current shell environment, run: eval $(opam env)
2025-02-06 13:17.50 ---> using "6938105b4028ef2275b20e90187e0ced273cd3f10aada66c4d43af98123e1d1c" from cache

/: (copy (src picos_std.opam picos_mux.opam picos_meta.opam picos_lwt.opam picos_io_cohttp.opam picos_io.opam picos_aux.opam picos.opam) (dst /home/opam/src/./))
2025-02-06 13:17.50 ---> using "b533cf253718ee14e4e39d96ab15284da0b735ef2c6e50b6a365af00ca4fca0f" from cache

/: (run (network host) (shell "opam pin add -yn picos_std.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_mux.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_meta.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_lwt.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_io_cohttp.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_io.dev '/home/opam/src/./' && \ \nopam pin add -yn picos_aux.dev '/home/opam/src/./' && \ \nopam pin add -yn picos.dev '/home/opam/src/./'"))
[picos_std.dev] synchronised (file:///home/opam/src)
picos_std is now pinned to file:///home/opam/src (version dev)
[picos_mux.dev] synchronised (file:///home/opam/src)
picos_mux is now pinned to file:///home/opam/src (version dev)
[picos_meta.dev] synchronised (file:///home/opam/src)
picos_meta is now pinned to file:///home/opam/src (version dev)
[picos_lwt.dev] synchronised (file:///home/opam/src)
picos_lwt is now pinned to file:///home/opam/src (version dev)
[picos_io_cohttp.dev] synchronised (file:///home/opam/src)
picos_io_cohttp is now pinned to file:///home/opam/src (version dev)
[picos_io.dev] synchronised (file:///home/opam/src)
picos_io is now pinned to file:///home/opam/src (version dev)
[picos_aux.dev] synchronised (file:///home/opam/src)
picos_aux is now pinned to file:///home/opam/src (version dev)
[picos.dev] synchronised (file:///home/opam/src)
picos is now pinned to file:///home/opam/src (version dev)
2025-02-06 13:17.50 ---> using "19a81d832442979d4c4a7b7e46b18f2d7b654aab3e7c0c5a8669e28ef162ecf1" from cache

/: (run (network host) (shell "echo '(lang dune 3.0)' > '/home/opam/src/./dune-project'"))
2025-02-06 13:17.50 ---> using "3c8b7c86a13e66652554cd2784c876843649cc817463c1eb2294d8a22a59c5ef" from cache

/: (env DEPS "alcotest.1.8.0 angstrom.0.16.1 asn1-combinators.0.3.2 astring.0.8.5 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-bytes.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base base64.3.5.1 bigstringaf.0.10.0 bos.0.2.1 ca-certs.1.0.0 camlp-streams.5.0.1 cmdliner.1.3.0 cohttp.6.0.0 cohttp-lwt.6.0.0 cohttp-lwt-unix.6.0.0 conduit.7.1.0 conduit-lwt.7.1.0 conduit-lwt-unix.7.1.0 conf-gmp.4 conf-gmp-powm-sec.3 conf-npm.1 conf-pkg-config.4 containers.3.15 cppo.1.8.0 csexp.1.5.2 digestif.1.2.0 domain-local-await.1.0.1 domain-name.0.4.0 domain_shims.0.1.0 dscheck.0.5.0 dune.3.17.2 dune-configurator.3.17.2 duration.0.2.1 either.1.0.0 eqaf.0.10 fmt.0.9.0 fpath.0.7.3 gen.1.1 gmap.0.3.0 http.6.0.0 ipaddr.5.6.0 ipaddr-sexp.5.6.0 js_of_ocaml.5.9.1 js_of_ocaml-compiler.5.9.1 kdf.1.0.0 logs.0.7.0 lwt.5.9.0 macaddr.5.6.0 magic-mime.1.3.1 mdx.2.5.0 menhir.20240715 menhirCST.20240715 menhirLib.20240715 menhirSdk.20240715 mirage-crypto.2.0.0 mirage-crypto-ec.2.0.0 mirage-crypto-pk.2.0.0 mirage-crypto-rng.2.0.0 mtime.2.1.0 multicore-bench.0.1.7 multicore-magic.2.3.1 multicore-magic-dscheck.2.3.1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.3.7.3 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.15.0 ocamlfind.1.9.8 ocplib-endian.1.2 ohex.0.2.0 oseq.0.5.1 ppx_derivers.1.2.1 ppx_sexp_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 psq.0.2.1 ptime.1.2.0 qcheck-core.0.23 qcheck-multicoretests-util.0.7 qcheck-stm.0.7 re.1.12.0 result.1.5 rresult.0.7.0 sedlex.3.3 seq.base sexplib0.v0.17.0 stdlib-shims.0.3.0 stringext.1.6.0 thread-local-storage.0.2 thread-table.1.0.0 topkg.1.0.7 tsort.2.1.0 uri.4.4.0 uri-sexp.4.4.0 uutf.1.0.3 x509.1.0.5 yojson.2.2.2 zarith.1.14")
/: (env CI true)
/: (env OCAMLCI true)

/: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y picos_std.dev picos_mux.dev picos_meta.dev picos_lwt.dev picos_io_cohttp.dev picos_io.dev picos_aux.dev picos.dev $DEPS"))
[WARNING] Unknown update command for bsd, skipping system update

<><> Synchronising pinned packages ><><><><><><><><><><><><><><><><><><><><><><>
[picos.dev] synchronised (file:///home/opam/src)
[picos_io.dev] synchronised (file:///home/opam/src)
[picos_aux.dev] synchronised (file:///home/opam/src)
[picos_io_cohttp.dev] synchronised (file:///home/opam/src)
[picos_lwt.dev] synchronised (file:///home/opam/src)
[picos_meta.dev] synchronised (file:///home/opam/src)
[picos_mux.dev] synchronised (file:///home/opam/src)
[picos_std.dev] synchronised (file:///home/opam/src)

[NOTE] Package ocaml-options-vanilla is already installed (current version is 1).
[NOTE] Package ocaml-config is already installed (current version is 3).
[NOTE] Package ocaml-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml is already installed (current version is 5.3.0).
[NOTE] Package base-unix is already installed (current version is base).
[NOTE] Package base-threads is already installed (current version is base).
[NOTE] Package base-nnp is already installed (current version is base).
[NOTE] Package base-effects is already installed (current version is base).
[NOTE] Package base-domains is already installed (current version is base).
[NOTE] Package base-bigarray is already installed (current version is base).
The following system packages will first need to be installed:
  gmp node

<><> Handling external dependencies <><><><><><><><><><><><><><><><><><><><><><>
+ /usr/bin/doas "pkg_add" "-I" "gmp" "node"
- quirks-7.50 signed on 2025-02-05T10:08:36Z
- --- +node-20.18.2v0 -------------------
-     You may wish to add /usr/local/lib/node_modules/npm/man to /etc/man.conf
- --- +openssl-3.2.3v0 -------------------
-     You may wish to add /usr/local/lib/eopenssl32/man to /etc/man.conf
2025-02-06 13:17.50 ---> using "fedb82b5c52e22d84bb203e4a4ce35329684173dce28aff445c9b97b32eed82d" from cache

/: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS"))
[NOTE] Package ocaml-options-vanilla is already installed (current version is 1).
[NOTE] Package ocaml-config is already installed (current version is 3).
[NOTE] Package ocaml-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml is already installed (current version is 5.3.0).
[NOTE] Package base-unix is already installed (current version is base).
[NOTE] Package base-threads is already installed (current version is base).
[NOTE] Package base-nnp is already installed (current version is base).
[NOTE] Package base-effects is already installed (current version is base).
[NOTE] Package base-domains is already installed (current version is base).
[NOTE] Package base-bigarray is already installed (current version is base).
The following actions will be performed:
=== install 99 packages
  - install alcotest 1.8.0
  - install angstrom 0.16.1
  - install asn1-combinators 0.3.2
  - install astring 0.8.5
  - install backoff 0.1.1
  - install base v0.17.1
  - install base-bytes base
  - install base64 3.5.1
  - install bigstringaf 0.10.0
  - install bos 0.2.1
  - install ca-certs 1.0.0
  - install camlp-streams 5.0.1
  - install cmdliner 1.3.0
  - install cohttp 6.0.0
  - install cohttp-lwt 6.0.0
  - install cohttp-lwt-unix 6.0.0
  - install conduit 7.1.0
  - install conduit-lwt 7.1.0
  - install conduit-lwt-unix 7.1.0
  - install conf-gmp 4
  - install conf-gmp-powm-sec 3
  - install conf-npm 1
  - install conf-pkg-config 4
  - install containers 3.15
  - install cppo 1.8.0
  - install csexp 1.5.2
  - install digestif 1.2.0
  - install domain-local-await 1.0.1
  - install domain-name 0.4.0
  - install domain_shims 0.1.0
  - install dscheck 0.5.0
  - install dune 3.17.2
  - install dune-configurator 3.17.2
  - install duration 0.2.1
  - install either 1.0.0
  - install eqaf 0.10
  - install fmt 0.9.0
  - install fpath 0.7.3
  - install gen 1.1
  - install gmap 0.3.0
  - install http 6.0.0
  - install ipaddr 5.6.0
  - install ipaddr-sexp 5.6.0
  - install js_of_ocaml 5.9.1
  - install js_of_ocaml-compiler 5.9.1
  - install kdf 1.0.0
  - install logs 0.7.0
  - install lwt 5.9.0
  - install macaddr 5.6.0
  - install magic-mime 1.3.1
  - install mdx 2.5.0
  - install menhir 20240715
  - install menhirCST 20240715
  - install menhirLib 20240715
  - install menhirSdk 20240715
  - install mirage-crypto 2.0.0
  - install mirage-crypto-ec 2.0.0
  - install mirage-crypto-pk 2.0.0
  - install mirage-crypto-rng 2.0.0
  - install mtime 2.1.0
  - install multicore-bench 0.1.7
  - install multicore-magic 2.3.1
  - install multicore-magic-dscheck 2.3.1
  - install ocaml-compiler-libs v0.17.0
  - install ocaml-syntax-shims 1.0.0
  - install ocaml-version 3.7.3
  - install ocaml_intrinsics_kernel v0.17.1
  - install ocamlbuild 0.15.0
  - install ocamlfind 1.9.8
  - install ocplib-endian 1.2
  - install ohex 0.2.0
  - install oseq 0.5.1
  - install ppx_derivers 1.2.1
  - install ppx_sexp_conv v0.17.0
  - install ppxlib 0.35.0
  - install ppxlib_jane v0.17.2
  - install psq 0.2.1
  - install ptime 1.2.0
  - install qcheck-core 0.23
  - install qcheck-multicoretests-util 0.7
  - install qcheck-stm 0.7
  - install re 1.12.0
  - install result 1.5
  - install rresult 0.7.0
  - install sedlex 3.3
  - install seq base
  - install sexplib0 v0.17.0
  - install stdlib-shims 0.3.0
  - install stringext 1.6.0
  - install thread-local-storage 0.2
  - install thread-table 1.0.0
  - install topkg 1.0.7
  - install tsort 2.1.0
  - install uri 4.4.0
  - install uri-sexp 4.4.0
  - install uutf 1.0.3
  - install x509 1.0.5
  - install yojson 2.2.2
  - install zarith 1.14

<><> Processing actions <><><><><><><><><><><><><><><><><><><><><><><><><><><><>
-> retrieved asn1-combinators.0.3.2 (cached)
-> retrieved angstrom.0.16.1 (cached)
-> retrieved astring.0.8.5 (cached)
-> retrieved backoff.0.1.1 (cached)
-> retrieved alcotest.1.8.0 (cached)
-> retrieved base64.3.5.1 (cached)
-> retrieved bigstringaf.0.10.0 (cached)
-> retrieved base.v0.17.1 (cached)
-> retrieved ca-certs.1.0.0 (cached)
-> retrieved camlp-streams.5.0.1 (cached)
-> retrieved bos.0.2.1 (cached)
-> retrieved cmdliner.1.3.0 (cached)
-> retrieved conf-gmp.4 (cached)
-> retrieved conf-gmp-powm-sec.3 (cached)
-> retrieved conduit.7.1.0, conduit-lwt.7.1.0, conduit-lwt-unix.7.1.0 (cached)
-> retrieved containers.3.15 (cached)
-> retrieved cppo.1.8.0 (cached)
-> retrieved csexp.1.5.2 (cached)
-> retrieved cohttp.6.0.0, cohttp-lwt.6.0.0, cohttp-lwt-unix.6.0.0, http.6.0.0 (cached)
-> retrieved domain-local-await.1.0.1 (cached)
-> retrieved domain-name.0.4.0 (cached)
-> retrieved digestif.1.2.0 (cached)
-> retrieved dscheck.0.5.0 (cached)
-> retrieved domain_shims.0.1.0 (cached)
-> retrieved duration.0.2.1 (cached)
-> retrieved either.1.0.0 (cached)
-> retrieved eqaf.0.10 (cached)
-> retrieved fmt.0.9.0 (cached)
-> retrieved fpath.0.7.3 (cached)
-> retrieved gen.1.1 (cached)
-> retrieved gmap.0.3.0 (cached)
-> retrieved ipaddr.5.6.0, ipaddr-sexp.5.6.0, macaddr.5.6.0 (cached)
-> retrieved kdf.1.0.0 (cached)
-> retrieved logs.0.7.0 (cached)
-> retrieved lwt.5.9.0 (cached)
-> retrieved magic-mime.1.3.1 (cached)
-> retrieved mdx.2.5.0 (cached)
-> retrieved menhir.20240715, menhirCST.20240715, menhirLib.20240715, menhirSdk.20240715 (cached)
-> retrieved js_of_ocaml.5.9.1, js_of_ocaml-compiler.5.9.1 (cached)
-> retrieved mirage-crypto.2.0.0, mirage-crypto-ec.2.0.0, mirage-crypto-pk.2.0.0, mirage-crypto-rng.2.0.0 (cached)
-> retrieved mtime.2.1.0 (cached)
-> retrieved multicore-bench.0.1.7 (cached)
-> retrieved multicore-magic.2.3.1, multicore-magic-dscheck.2.3.1 (cached)
-> retrieved ocaml-compiler-libs.v0.17.0 (cached)
-> retrieved ocaml-syntax-shims.1.0.0 (cached)
-> retrieved ocaml-version.3.7.3 (cached)
-> retrieved dune.3.17.2, dune-configurator.3.17.2 (cached)
-> retrieved ocamlbuild.0.15.0 (cached)
-> retrieved ocamlfind.1.9.8 (cached)
-> retrieved ocplib-endian.1.2 (cached)
-> retrieved ohex.0.2.0 (cached)
-> retrieved ocaml_intrinsics_kernel.v0.17.1 (cached)
-> retrieved ppx_derivers.1.2.1 (cached)
-> retrieved oseq.0.5.1 (cached)
-> retrieved ppx_sexp_conv.v0.17.0 (cached)
-> retrieved ppxlib_jane.v0.17.2 (cached)
-> retrieved psq.0.2.1 (cached)
-> retrieved qcheck-core.0.23 (cached)
-> retrieved ptime.1.2.0 (cached)
-> retrieved ppxlib.0.35.0 (cached)
-> retrieved re.1.12.0 (cached)
-> retrieved result.1.5 (cached)
-> installed cmdliner.1.3.0
-> installed conf-gmp.4
-> retrieved sedlex.3.3 (cached)
-> retrieved seq.base (cached)
-> retrieved rresult.0.7.0 (cached)
-> retrieved sexplib0.v0.17.0 (cached)
-> retrieved stdlib-shims.0.3.0 (cached)
-> retrieved stringext.1.6.0 (cached)
-> retrieved thread-local-storage.0.2 (cached)
-> retrieved thread-table.1.0.0 (cached)
-> retrieved qcheck-multicoretests-util.0.7, qcheck-stm.0.7 (cached)
-> retrieved tsort.2.1.0 (cached)
-> retrieved topkg.1.0.7 (cached)
-> retrieved uutf.1.0.3 (cached)
-> retrieved uri.4.4.0, uri-sexp.4.4.0 (cached)
-> retrieved x509.1.0.5 (cached)
-> retrieved zarith.1.14 (cached)
-> retrieved yojson.2.2.2 (cached)
-> installed conf-npm.1
-> installed conf-gmp-powm-sec.3
-> installed conf-pkg-config.4
-> installed dune.3.17.2
-> installed ocamlbuild.0.15.0
-> installed backoff.0.1.1
-> installed base64.3.5.1
-> installed camlp-streams.5.0.1
-> installed cppo.1.8.0
-> installed csexp.1.5.2
-> installed domain-name.0.4.0
-> installed domain_shims.0.1.0
-> installed dune-configurator.3.17.2
-> installed duration.0.2.1
-> installed bigstringaf.0.10.0
-> installed either.1.0.0
-> installed eqaf.0.10
-> installed containers.3.15
-> installed digestif.1.2.0
-> installed gmap.0.3.0
-> installed http.6.0.0
-> installed macaddr.5.6.0
-> installed magic-mime.1.3.1
-> installed ipaddr.5.6.0
-> installed menhirCST.20240715
-> installed menhirLib.20240715
-> installed menhirSdk.20240715
-> installed mirage-crypto.2.0.0
-> installed menhir.20240715
-> installed kdf.1.0.0
-> installed multicore-magic.2.3.1
-> installed ocaml-compiler-libs.v0.17.0
-> installed ocaml-syntax-shims.1.0.0
-> installed ocaml-version.3.7.3
-> installed angstrom.0.16.1
-> installed ocaml_intrinsics_kernel.v0.17.1
-> installed ocamlfind.1.9.8
-> installed base-bytes.base
-> installed ohex.0.2.0
-> installed oseq.0.5.1
-> installed ocplib-endian.1.2
-> installed ppx_derivers.1.2.1
-> installed lwt.5.9.0
-> installed qcheck-core.0.23
-> installed result.1.5
-> installed qcheck-multicoretests-util.0.7
-> installed seq.base
-> installed sexplib0.v0.17.0
-> installed gen.1.1
-> installed base.v0.17.1
-> installed psq.0.2.1
-> installed qcheck-stm.0.7
-> installed re.1.12.0
-> installed stdlib-shims.0.3.0
-> installed stringext.1.6.0
-> installed ppxlib.0.35.0
-> installed thread-local-storage.0.2
-> installed ppxlib_jane.v0.17.2
-> installed sedlex.3.3
-> installed ppx_sexp_conv.v0.17.0
-> installed thread-table.1.0.0
-> installed ipaddr-sexp.5.6.0
-> installed domain-local-await.1.0.1
-> installed topkg.1.0.7
-> installed tsort.2.1.0
-> installed astring.0.8.5
-> installed dscheck.0.5.0
-> installed fmt.0.9.0
-> installed fpath.0.7.3
-> installed mtime.2.1.0
-> installed multicore-magic-dscheck.2.3.1
-> installed ptime.1.2.0
-> installed rresult.0.7.0
-> installed asn1-combinators.0.3.2
-> installed uri.4.4.0
-> installed uutf.1.0.3
-> installed uri-sexp.4.4.0
-> installed alcotest.1.8.0
-> installed yojson.2.2.2
-> installed zarith.1.14
-> installed js_of_ocaml-compiler.5.9.1
-> installed multicore-bench.0.1.7
-> installed js_of_ocaml.5.9.1
-> installed logs.0.7.0
-> installed bos.0.2.1
-> installed cohttp.6.0.0
-> installed conduit.7.1.0
-> installed cohttp-lwt.6.0.0
-> installed conduit-lwt.7.1.0
-> installed mdx.2.5.0
-> installed mirage-crypto-rng.2.0.0
-> installed mirage-crypto-ec.2.0.0
-> installed mirage-crypto-pk.2.0.0
-> installed x509.1.0.5
-> installed ca-certs.1.0.0
-> installed conduit-lwt-unix.7.1.0
-> installed cohttp-lwt-unix.6.0.0
Done.
# To update the current shell environment, run: eval $(opam env)
2025-02-06 13:45.31 ---> saved as "9c34a8f89cb1637d793c6a809a9fd5533c221cb55f395e9f8244d55a328148a2"

/: (copy (src .) (dst /home/opam/src))
2025-02-06 13:46.14 ---> saved as "ef823d2f249e1d1cb732e73c30e0a30731452f0209c36dbe67b9dd1d277ac6d5"

/: (run (shell "cd /home/opam/src && opam exec -- dune build @install @check @runtest && rm -rf _build"))
(cd _build/default/test && ./test_mpmcq_dscheck.exe)
Testing `Picos_mpmcq DSCheck'.
This run has ID `XPS2HHJR'.
[OK] Multiple pushes and pops 0
Full test results in `~/src/_build/default/test/_build/_tests/Picos_mpmcq DSCheck'.
Test Successful in 2.330s. 1 test run.
(cd _build/default/test && ./test_picos_dscheck.exe)
Testing `Picos DSCheck'.
This run has ID `4DTYAGMM'.
[OK] Trigger 0 basic contract.
[OK] Computation 0 basic contract.
[OK] Computation 1 removes triggers.
Full test results in `~/src/_build/default/test/_build/_tests/Picos DSCheck'.
Test Successful in 0.905s. 3 tests run.
(cd _build/default/test && /usr/local/bin/node test_js_of_ocaml.bc.js)
Hello, from js_of_ocaml with Picos!
(cd _build/default/test && ./test_htbl.exe)
random seed: 1283046024126154779
generated error fail pass / total time test name
[✓] 32 0 0 32 / 32 0.0s Htbl sequential
[✓] 32 0 0 32 / 32 8.3s Htbl parallel
================================================================================
success (ran 2 tests)
random seed: 2940638429284106872
generated error fail pass / total time test name
[✓] 64 0 0 64 / 64 0.0s Htbl sequential
[✓] 64 0 0 64 / 64 10.1s Htbl parallel
================================================================================
success (ran 2 tests)
random seed: 1906030807505487442
generated error fail pass / total time test name
[✓] 128 0 0 128 / 128 0.0s Htbl sequential
[ ] 127 0 0 127 / 128 23.7s Htbl parallel
] 128 0 0 128 / 128 23.9s Htbl parallel [✓] 128 0 0 128 / 128 23.9s Htbl parallel ================================================================================ success (ran 2 tests) random seed: 3070432122467148740 generated error fail pass / total time test name [ ] 0 0 0 0 / 93 0.0s Htbl sequential [✓] 93 0 0 93 / 93 0.0s Htbl sequential [ ] 0 0 0 0 / 93 0.0s Htbl parallel [ ] 1 0 0 1 / 93 0.3s Htbl parallel [ ] 2 0 0 2 / 93 0.5s Htbl parallel [ ] 3 0 0 3 / 93 0.8s Htbl parallel [ ] 4 0 0 4 / 93 1.0s Htbl parallel [ ] 5 0 0 5 / 93 1.3s Htbl parallel [ ] 6 0 0 6 / 93 1.5s Htbl parallel [ ] 7 0 0 7 / 93 2.1s Htbl parallel [ ] 8 0 0 8 / 93 2.2s Htbl parallel [ ] 10 0 0 10 / 93 2.4s Htbl parallel [ ] 12 0 0 12 / 93 2.5s Htbl parallel [ ] 13 0 0 13 / 93 2.6s Htbl parallel [ ] 15 0 0 15 / 93 2.8s Htbl parallel [ ] 16 0 0 16 / 93 3.1s Htbl parallel [ ] 17 0 0 17 / 93 3.3s Htbl parallel [ ] 19 0 0 19 / 93 3.6s Htbl parallel [ ] 20 0 0 20 / 93 3.8s Htbl parallel [ ] 21 0 0 21 / 93 4.0s Htbl parallel [ ] 22 0 0 22 / 93 4.2s Htbl parallel [ ] 24 0 0 24 / 93 4.4s Htbl parallel [ ] 26 0 0 26 / 93 4.6s Htbl parallel [ ] 27 0 0 27 / 93 4.7s Htbl parallel [ ] 29 0 0 29 / 93 4.9s Htbl parallel [ ] 32 0 0 32 / 93 5.0s Htbl parallel [ ] 34 0 0 34 / 93 5.3s Htbl parallel [ ] 36 0 0 36 / 93 5.5s Htbl parallel [ ] 37 0 0 37 / 93 5.8s Htbl parallel [ ] 38 0 0 38 / 93 6.0s Htbl parallel [ ] 40 0 0 40 / 93 6.4s Htbl parallel [ ] 41 0 0 41 / 93 6.5s Htbl parallel [ ] 43 0 0 43 / 93 6.6s Htbl parallel [ ] 45 0 0 45 / 93 6.7s Htbl parallel [ ] 47 0 0 47 / 93 6.9s Htbl parallel [ ] 48 0 0 48 / 93 7.0s Htbl parallel [ ] 49 0 0 49 / 93 7.2s Htbl parallel [ ] 51 0 0 51 / 93 7.5s Htbl parallel [ ] 53 0 0 53 / 93 7.6s Htbl parallel [ ] 56 0 0 56 / 93 7.8s Htbl parallel [ ] 57 0 0 57 / 93 8.1s Htbl parallel [ ] 58 0 0 58 / 93 8.3s Htbl parallel [ ] 59 0 0 59 / 93 8.5s Htbl parallel [ ] 60 0 0 60 / 93 8.9s Htbl parallel [ ] 63 0 0 63 / 93 9.4s Htbl parallel [ ] 65 0 0 65 / 93 9.8s Htbl parallel 
[ ] 67 0 0 67 / 93 10.0s Htbl parallel [ ] 69 0 0 69 / 93 10.1s Htbl parallel [ ] 70 0 0 70 / 93 10.4s Htbl parallel [ ] 71 0 0 71 / 93 10.6s Htbl parallel [ ] 72 0 0 72 / 93 10.7s Htbl parallel [ ] 73 0 0 73 / 93 10.9s Htbl parallel [ ] 74 0 0 74 / 93 11.2s Htbl parallel [ ] 76 0 0 76 / 93 11.3s Htbl parallel [ ] 79 0 0 79 / 93 11.5s Htbl parallel [ ] 81 0 0 81 / 93 11.7s Htbl parallel [ ] 82 0 0 82 / 93 11.8s Htbl parallel [ ] 85 0 0 85 / 93 12.1s Htbl parallel [ ] 88 0 0 88 / 93 12.3s Htbl parallel [ ] 90 0 0 90 / 93 12.5s Htbl parallel [ ] 91 0 0 91 / 93 12.7s Htbl parallel [ ] 92 0 0 92 / 93 14.2s Htbl parallel [ ] 93 0 0 93 / 93 14.4s Htbl parallel [✓] 93 0 0 93 / 93 14.4s Htbl parallel ================================================================================ success (ran 2 tests) random seed: 3959537213517849292 generated error fail pass / total time test name [ ] 0 0 0 0 / 32 0.0s Htbl sequential [✓] 32 0 0 32 / 32 0.0s Htbl sequential [ ] 0 0 0 0 / 32 0.0s Htbl parallel [ ] 2 0 0 2 / 32 0.1s Htbl parallel [ ] 4 0 0 4 / 32 0.3s Htbl parallel [ ] 6 0 0 6 / 32 0.5s Htbl parallel [ ] 7 0 0 7 / 32 0.6s Htbl parallel [ ] 8 0 0 8 / 32 0.9s Htbl parallel [ ] 9 0 0 9 / 32 1.2s Htbl parallel [ ] 10 0 0 10 / 32 1.6s Htbl parallel [ ] 11 0 0 11 / 32 1.8s Htbl parallel [ ] 12 0 0 12 / 32 2.0s Htbl parallel [ ] 16 0 0 16 / 32 2.2s Htbl parallel [ ] 17 0 0 17 / 32 2.6s Htbl parallel [ ] 18 0 0 18 / 32 3.2s Htbl parallel [ ] 20 0 0 20 / 32 3.4s Htbl parallel [ ] 21 0 0 21 / 32 3.5s Htbl parallel [ ] 22 0 0 22 / 32 3.8s Htbl parallel [ ] 23 0 0 23 / 32 4.0s Htbl parallel [ ] 25 0 0 25 / 32 4.2s Htbl parallel [ ] 26 0 0 26 / 32 4.5s Htbl parallel [ ] 28 0 0 28 / 32 4.8s Htbl parallel [ ] 29 0 0 29 / 32 6.0s Htbl parallel [ ] 30 0 0 30 / 32 6.1s Htbl parallel [ ] 31 0 0 31 / 32 6.3s Htbl parallel [ ] 32 0 0 32 / 32 6.4s Htbl parallel [✓] 32 0 0 32 / 32 6.4s Htbl parallel ================================================================================ success (ran 2 
tests) (cd _build/default/test && ./test_mpmcq.exe) random seed: 4332102318188965492 generated error fail pass / total time test name [ ] 0 0 0 0 / 32 0.0s Mpmcq sequential [ ] 0 0 0 0 / 32 0.0s Mpmcq sequential (generating) [✓] 32 0 0 32 / 32 0.0s Mpmcq sequential [ ] 0 0 0 0 / 32 0.0s Mpmcq parallel [ ] 1 0 0 1 / 32 0.1s Mpmcq parallel [ ] 3 0 0 3 / 32 0.4s Mpmcq parallel [ ] 5 0 0 5 / 32 0.9s Mpmcq parallel [ ] 6 0 0 6 / 32 1.1s Mpmcq parallel [ ] 7 0 0 7 / 32 1.5s Mpmcq parallel [ ] 8 0 0 8 / 32 1.8s Mpmcq parallel [ ] 9 0 0 9 / 32 2.1s Mpmcq parallel [ ] 10 0 0 10 / 32 2.4s Mpmcq parallel [ ] 11 0 0 11 / 32 2.6s Mpmcq parallel [ ] 12 0 0 12 / 32 2.8s Mpmcq parallel [ ] 14 0 0 14 / 32 2.9s Mpmcq parallel [ ] 15 0 0 15 / 32 3.1s Mpmcq parallel [ ] 17 0 0 17 / 32 3.3s Mpmcq parallel [ ] 19 0 0 19 / 32 3.6s Mpmcq parallel [ ] 20 0 0 20 / 32 4.0s Mpmcq parallel [ ] 21 0 0 21 / 32 4.1s Mpmcq parallel [ ] 23 0 0 23 / 32 4.5s Mpmcq parallel [ ] 24 0 0 24 / 32 4.7s Mpmcq parallel [ ] 25 0 0 25 / 32 4.9s Mpmcq parallel [ ] 27 0 0 27 / 32 5.7s Mpmcq parallel [ ] 28 0 0 28 / 32 6.1s Mpmcq parallel [ ] 29 0 0 29 / 32 6.4s Mpmcq parallel [ ] 30 0 0 30 / 32 6.5s Mpmcq parallel [ ] 31 0 0 31 / 32 6.6s Mpmcq parallel [ ] 32 0 0 32 / 32 7.1s Mpmcq parallel [✓] 32 0 0 32 / 32 7.1s Mpmcq parallel ================================================================================ success (ran 2 tests) random seed: 4052664641342470015 generated error fail pass / total time test name [ ] 0 0 0 0 / 64 0.0s Mpmcq sequential [✓] 64 0 0 64 / 64 0.0s Mpmcq sequential [ ] 0 0 0 0 / 64 0.0s Mpmcq parallel [ ] 1 0 0 1 / 64 0.3s Mpmcq parallel [ ] 2 0 0 2 / 64 0.4s Mpmcq parallel [ ] 3 0 0 3 / 64 0.6s Mpmcq parallel [ ] 4 0 0 4 / 64 0.7s Mpmcq parallel [ ] 5 0 0 5 / 64 0.8s Mpmcq parallel [ ] 6 0 0 6 / 64 1.3s Mpmcq parallel [ ] 8 0 0 8 / 64 1.6s Mpmcq parallel [ ] 10 0 0 10 / 64 1.8s Mpmcq parallel [ ] 11 0 0 11 / 64 2.0s Mpmcq parallel [ ] 13 0 0 13 / 64 2.1s Mpmcq parallel [ ] 14 0 0 14 / 64 
2.2s Mpmcq parallel [ ] 16 0 0 16 / 64 2.4s Mpmcq parallel [ ] 17 0 0 17 / 64 2.5s Mpmcq parallel [ ] 18 0 0 18 / 64 2.7s Mpmcq parallel [ ] 19 0 0 19 / 64 3.1s Mpmcq parallel [ ] 23 0 0 23 / 64 3.2s Mpmcq parallel [ ] 25 0 0 25 / 64 3.4s Mpmcq parallel [ ] 26 0 0 26 / 64 3.5s Mpmcq parallel [ ] 27 0 0 27 / 64 3.7s Mpmcq parallel [ ] 28 0 0 28 / 64 3.9s Mpmcq parallel [ ] 29 0 0 29 / 64 4.0s Mpmcq parallel [ ] 32 0 0 32 / 64 4.3s Mpmcq parallel [ ] 34 0 0 34 / 64 4.4s Mpmcq parallel [ ] 36 0 0 36 / 64 4.5s Mpmcq parallel [ ] 37 0 0 37 / 64 4.7s Mpmcq parallel [ ] 39 0 0 39 / 64 5.0s Mpmcq parallel [ ] 42 0 0 42 / 64 5.2s Mpmcq parallel [ ] 44 0 0 44 / 64 5.3s Mpmcq parallel [ ] 49 0 0 49 / 64 5.4s Mpmcq parallel [ ] 50 0 0 50 / 64 5.6s Mpmcq parallel [ ] 51 0 0 51 / 64 5.7s Mpmcq parallel [ ] 54 0 0 54 / 64 5.9s Mpmcq parallel [ ] 55 0 0 55 / 64 6.0s Mpmcq parallel [ ] 57 0 0 57 / 64 6.4s Mpmcq parallel [ ] 59 0 0 59 / 64 6.5s Mpmcq parallel [ ] 61 0 0 61 / 64 6.7s Mpmcq parallel [ ] 62 0 0 62 / 64 6.9s Mpmcq parallel [ ] 63 0 0 63 / 64 7.2s Mpmcq parallel [✓] 64 0 0 64 / 64 7.2s Mpmcq parallel ================================================================================ success (ran 2 tests) random seed: 1182791591984860128 generated error fail pass / total time test name [ ] 0 0 0 0 / 128 0.0s Mpmcq sequential [✓] 128 0 0 128 / 128 0.0s Mpmcq sequential [ ] 0 0 0 0 / 128 0.0s Mpmcq parallel [ ] 1 0 0 1 / 128 0.0s Mpmcq parallel [ ] 3 0 0 3 / 128 0.2s Mpmcq parallel [ ] 5 0 0 5 / 128 0.4s Mpmcq parallel [ ] 6 0 0 6 / 128 0.9s Mpmcq parallel [ ] 7 0 0 7 / 128 1.1s Mpmcq parallel [ ] 8 0 0 8 / 128 1.2s Mpmcq parallel [ ] 9 0 0 9 / 128 1.4s Mpmcq parallel [ ] 11 0 0 11 / 128 1.5s Mpmcq parallel [ ] 12 0 0 12 / 128 1.7s Mpmcq parallel [ ] 14 0 0 14 / 128 1.9s Mpmcq parallel [ ] 15 0 0 15 / 128 2.3s Mpmcq parallel [ ] 17 0 0 17 / 128 2.5s Mpmcq parallel [ ] 20 0 0 20 / 128 2.6s Mpmcq parallel [ ] 22 0 0 22 / 128 2.7s Mpmcq parallel [ ] 23 0 0 23 / 128 2.9s Mpmcq 
parallel [ ] 24 0 0 24 / 128 3.0s Mpmcq parallel [ ] 27 0 0 27 / 128 3.1s Mpmcq parallel [ ] 30 0 0 30 / 128 3.3s Mpmcq parallel [ ] 33 0 0 33 / 128 3.5s Mpmcq parallel [ ] 34 0 0 34 / 128 3.6s Mpmcq parallel [ ] 37 0 0 37 / 128 3.8s Mpmcq parallel [ ] 38 0 0 38 / 128 4.0s Mpmcq parallel [ ] 40 0 0 40 / 128 4.1s Mpmcq parallel [ ] 44 0 0 44 / 128 4.4s Mpmcq parallel [ ] 46 0 0 46 / 128 4.5s Mpmcq parallel [ ] 48 0 0 48 / 128 4.7s Mpmcq parallel [ ] 50 0 0 50 / 128 4.8s Mpmcq parallel [ ] 52 0 0 52 / 128 4.9s Mpmcq parallel [ ] 53 0 0 53 / 128 5.2s Mpmcq parallel [ ] 55 0 0 55 / 128 5.6s Mpmcq parallel [ ] 56 0 0 56 / 128 5.7s Mpmcq parallel [ ] 58 0 0 58 / 128 5.9s Mpmcq parallel [ ] 61 0 0 61 / 128 6.1s Mpmcq parallel [ ] 63 0 0 63 / 128 6.4s Mpmcq parallel [ ] 64 0 0 64 / 128 6.6s Mpmcq parallel [ ] 66 0 0 66 / 128 6.7s Mpmcq parallel [ ] 70 0 0 70 / 128 6.8s Mpmcq parallel [ ] 73 0 0 73 / 128 6.9s Mpmcq parallel [ ] 75 0 0 75 / 128 7.1s Mpmcq parallel [ ] 77 0 0 77 / 128 7.2s Mpmcq parallel [ ] 80 0 0 80 / 128 7.3s Mpmcq parallel [ ] 83 0 0 83 / 128 7.6s Mpmcq parallel [ ] 84 0 0 84 / 128 7.7s Mpmcq parallel [ ] 85 0 0 85 / 128 8.2s Mpmcq parallel [ ] 86 0 0 86 / 128 8.3s Mpmcq parallel [ ] 87 0 0 87 / 128 8.7s Mpmcq parallel [ ] 88 0 0 88 / 128 8.8s Mpmcq parallel [ ] 89 0 0 89 / 128 9.2s Mpmcq parallel [ ] 90 0 0 90 / 128 9.6s Mpmcq parallel [ ] 91 0 0 91 / 128 10.0s Mpmcq parallel [ ] 94 0 0 94 / 128 10.2s Mpmcq parallel [ ] 95 0 0 95 / 128 10.3s Mpmcq parallel [ ] 98 0 0 98 / 128 10.5s Mpmcq parallel [ ] 100 0 0 100 / 128 10.6s Mpmcq parallel [ ] 101 0 0 101 / 128 10.8s Mpmcq parallel [ ] 104 0 0 104 / 128 11.0s Mpmcq parallel [ ] 106 0 0 106 / 128 12.0s Mpmcq parallel [ ] 108 0 0 108 / 128 12.3s Mpmcq parallel [ ] 110 0 0 110 / 128 12.5s Mpmcq parallel [ ] 111 0 0 111 / 128 12.7s Mpmcq parallel [ ] 112 0 0 112 / 128 12.9s Mpmcq parallel [ ] 113 0 0 113 / 128 13.0s Mpmcq parallel [ ] 116 0 0 116 / 128 13.1s Mpmcq parallel [ ] 118 0 0 118 / 128 13.3s Mpmcq 
parallel [ ] 119 0 0 119 / 128 13.4s Mpmcq parallel [ ] 120 0 0 120 / 128 13.5s Mpmcq parallel [ ] 121 0 0 121 / 128 13.7s Mpmcq parallel [ ] 122 0 0 122 / 128 13.9s Mpmcq parallel [ ] 123 0 0 123 / 128 14.0s Mpmcq parallel [ ] 124 0 0 124 / 128 14.1s Mpmcq parallel [ ] 126 0 0 126 / 128 14.3s Mpmcq parallel [ ] 128 0 0 128 / 128 14.5s Mpmcq parallel [✓] 128 0 0 128 / 128 14.5s Mpmcq parallel ================================================================================ success (ran 2 tests) random seed: 4480444929238819214 generated error fail pass / total time test name [ ] 0 0 0 0 / 243 0.0s Mpmcq sequential [✓] 243 0 0 243 / 243 0.0s Mpmcq sequential [ ] 0 0 0 0 / 243 0.0s Mpmcq parallel [ ] 1 0 0 1 / 243 0.3s Mpmcq parallel [ ] 2 0 0 2 / 243 0.5s Mpmcq parallel [ ] 4 0 0 4 / 243 0.7s Mpmcq parallel [ ] 5 0 0 5 / 243 0.9s Mpmcq parallel [ ] 6 0 0 6 / 243 1.0s Mpmcq parallel [ ] 7 0 0 7 / 243 1.2s Mpmcq parallel [ ] 8 0 0 8 / 243 1.4s Mpmcq parallel [ ] 9 0 0 9 / 243 1.7s Mpmcq parallel [ ] 10 0 0 10 / 243 1.9s Mpmcq parallel [ ] 11 0 0 11 / 243 2.1s Mpmcq parallel [ ] 12 0 0 12 / 243 2.2s Mpmcq parallel [ ] 13 0 0 13 / 243 2.5s Mpmcq parallel [ ] 14 0 0 14 / 243 2.8s Mpmcq parallel [ ] 15 0 0 15 / 243 3.0s Mpmcq parallel [ ] 16 0 0 16 / 243 3.3s Mpmcq parallel [ ] 17 0 0 17 / 243 3.4s Mpmcq parallel [ ] 18 0 0 18 / 243 3.6s Mpmcq parallel [ ] 19 0 0 19 / 243 3.7s Mpmcq parallel [ ] 21 0 0 21 / 243 4.0s Mpmcq parallel [ ] 23 0 0 23 / 243 4.1s Mpmcq parallel [ ] 26 0 0 26 / 243 4.3s Mpmcq parallel [ ] 28 0 0 28 / 243 4.5s Mpmcq parallel [ ] 29 0 0 29 / 243 4.6s Mpmcq parallel [ ] 30 0 0 30 / 243 4.7s Mpmcq parallel [ ] 32 0 0 32 / 243 4.9s Mpmcq parallel [ ] 33 0 0 33 / 243 5.1s Mpmcq parallel [ ] 34 0 0 34 / 243 5.2s Mpmcq parallel [ ] 35 0 0 35 / 243 5.3s Mpmcq parallel [ ] 37 0 0 37 / 243 5.5s Mpmcq parallel [ ] 38 0 0 38 / 243 5.6s Mpmcq parallel [ ] 39 0 0 39 / 243 5.7s Mpmcq parallel [ ] 41 0 0 41 / 243 5.9s Mpmcq parallel [ ] 43 0 0 43 / 243 6.0s Mpmcq 
parallel [ ] 45 0 0 45 / 243 6.3s Mpmcq parallel [ ] 47 0 0 47 / 243 6.4s Mpmcq parallel [ ] 49 0 0 49 / 243 6.6s Mpmcq parallel [ ] 50 0 0 50 / 243 6.8s Mpmcq parallel [ ] 53 0 0 53 / 243 7.1s Mpmcq parallel [ ] 54 0 0 54 / 243 7.3s Mpmcq parallel [ ] 55 0 0 55 / 243 7.4s Mpmcq parallel [ ] 56 0 0 56 / 243 7.5s Mpmcq parallel [ ] 57 0 0 57 / 243 7.7s Mpmcq parallel [ ] 59 0 0 59 / 243 7.9s Mpmcq parallel [ ] 60 0 0 60 / 243 8.0s Mpmcq parallel [ ] 61 0 0 61 / 243 8.2s Mpmcq parallel [ ] 62 0 0 62 / 243 8.3s Mpmcq parallel [ ] 63 0 0 63 / 243 8.8s Mpmcq parallel [ ] 64 0 0 64 / 243 9.0s Mpmcq parallel [ ] 65 0 0 65 / 243 9.2s Mpmcq parallel [ ] 66 0 0 66 / 243 9.4s Mpmcq parallel [ ] 67 0 0 67 / 243 9.6s Mpmcq parallel [ ] 69 0 0 69 / 243 9.9s Mpmcq parallel [ ] 70 0 0 70 / 243 10.0s Mpmcq parallel [ ] 71 0 0 71 / 243 10.2s Mpmcq parallel [ ] 72 0 0 72 / 243 10.4s Mpmcq parallel [ ] 73 0 0 73 / 243 10.5s Mpmcq parallel [ ] 75 0 0 75 / 243 10.7s Mpmcq parallel [ ] 76 0 0 76 / 243 10.8s Mpmcq parallel [ ] 77 0 0 77 / 243 11.0s Mpmcq parallel [ ] 78 0 0 78 / 243 11.1s Mpmcq parallel [ ] 79 0 0 79 / 243 11.2s Mpmcq parallel [ ] 80 0 0 80 / 243 11.4s Mpmcq parallel [ ] 82 0 0 82 / 243 11.5s Mpmcq parallel [ ] 84 0 0 84 / 243 11.7s Mpmcq parallel [ ] 86 0 0 86 / 243 11.9s Mpmcq parallel [ ] 88 0 0 88 / 243 12.0s Mpmcq parallel [ ] 89 0 0 89 / 243 12.3s Mpmcq parallel [ ] 91 0 0 91 / 243 12.7s Mpmcq parallel [ ] 92 0 0 92 / 243 12.8s Mpmcq parallel [ ] 93 0 0 93 / 243 13.2s Mpmcq parallel [ ] 94 0 0 94 / 243 13.5s Mpmcq parallel [ ] 95 0 0 95 / 243 13.7s Mpmcq parallel [ ] 96 0 0 96 / 243 14.0s Mpmcq parallel [ ] 97 0 0 97 / 243 14.3s Mpmcq parallel [ ] 98 0 0 98 / 243 14.6s Mpmcq parallel [ ] 99 0 0 99 / 243 14.9s Mpmcq parallel [ ] 100 0 0 100 / 243 15.2s Mpmcq parallel [ ] 101 0 0 101 / 243 15.9s Mpmcq parallel [ ] 103 0 0 103 / 243 16.0s Mpmcq parallel [ ] 105 0 0 105 / 243 16.2s Mpmcq parallel [ ] 106 0 0 106 / 243 16.4s Mpmcq parallel [ ] 107 0 0 107 / 243 16.6s 
Mpmcq parallel [ ] 108 0 0 108 / 243 16.8s Mpmcq parallel [ ] 109 0 0 109 / 243 17.0s Mpmcq parallel [ ] 111 0 0 111 / 243 17.2s Mpmcq parallel [ ] 112 0 0 112 / 243 17.3s Mpmcq parallel [ ] 113 0 0 113 / 243 17.5s Mpmcq parallel [ ] 115 0 0 115 / 243 17.9s Mpmcq parallel [ ] 116 0 0 116 / 243 18.0s Mpmcq parallel [ ] 118 0 0 118 / 243 18.2s Mpmcq parallel [ ] 120 0 0 120 / 243 18.3s Mpmcq parallel [ ] 122 0 0 122 / 243 18.5s Mpmcq parallel [ ] 123 0 0 123 / 243 18.6s Mpmcq parallel [ ] 125 0 0 125 / 243 18.7s Mpmcq parallel [ ] 126 0 0 126 / 243 18.8s Mpmcq parallel [ ] 127 0 0 127 / 243 19.0s Mpmcq parallel [ ] 128 0 0 128 / 243 19.2s Mpmcq parallel [ ] 129 0 0 129 / 243 19.4s Mpmcq parallel [ ] 130 0 0 130 / 243 19.7s Mpmcq parallel [ ] 132 0 0 132 / 243 20.0s Mpmcq parallel [ ] 133 0 0 133 / 243 20.3s Mpmcq parallel [ ] 134 0 0 134 / 243 20.4s Mpmcq parallel [ ] 136 0 0 136 / 243 20.6s Mpmcq parallel [ ] 138 0 0 138 / 243 20.9s Mpmcq parallel [ ] 139 0 0 139 / 243 21.0s Mpmcq parallel [ ] 140 0 0 140 / 243 21.1s Mpmcq parallel [ ] 143 0 0 143 / 243 21.3s Mpmcq parallel [ ] 147 0 0 147 / 243 21.6s Mpmcq parallel [ ] 148 0 0 148 / 243 21.7s Mpmcq parallel [ ] 149 0 0 149 / 243 21.9s Mpmcq parallel [ ] 150 0 0 150 / 243 22.0s Mpmcq parallel [ ] 151 0 0 151 / 243 22.3s Mpmcq parallel [ ] 152 0 0 152 / 243 22.6s Mpmcq parallel [ ] 156 0 0 156 / 243 22.9s Mpmcq parallel [ ] 157 0 0 157 / 243 23.0s Mpmcq parallel [ ] 160 0 0 160 / 243 23.1s Mpmcq parallel [ ] 161 0 0 161 / 243 23.5s Mpmcq parallel [ ] 163 0 0 163 / 243 23.7s Mpmcq parallel [ ] 165 0 0 165 / 243 23.8s Mpmcq parallel [ ] 166 0 0 166 / 243 24.0s Mpmcq parallel [ ] 167 0 0 167 / 243 24.2s Mpmcq parallel [ ] 168 0 0 168 / 243 24.3s Mpmcq parallel [ ] 169 0 0 169 / 243 24.4s Mpmcq parallel [ ] 170 0 0 170 / 243 24.6s Mpmcq parallel [ ] 171 0 0 171 / 243 24.9s Mpmcq parallel [ ] 173 0 0 173 / 243 25.0s Mpmcq parallel [ ] 176 0 0 176 / 243 25.2s Mpmcq parallel [ ] 179 0 0 179 / 243 25.5s Mpmcq parallel [ ] 
182 0 0 182 / 243 25.6s Mpmcq parallel [ ] 184 0 0 184 / 243 25.7s Mpmcq parallel [ ] 186 0 0 186 / 243 26.0s Mpmcq parallel [ ] 189 0 0 189 / 243 26.1s Mpmcq parallel [ ] 191 0 0 191 / 243 26.9s Mpmcq parallel [ ] 192 0 0 192 / 243 27.8s Mpmcq parallel [ ] 193 0 0 193 / 243 27.9s Mpmcq parallel [ ] 195 0 0 195 / 243 28.2s Mpmcq parallel [ ] 197 0 0 197 / 243 28.4s Mpmcq parallel [ ] 199 0 0 199 / 243 28.5s Mpmcq parallel [ ] 200 0 0 200 / 243 28.6s Mpmcq parallel [ ] 201 0 0 201 / 243 28.8s Mpmcq parallel [ ] 202 0 0 202 / 243 29.0s Mpmcq parallel [ ] 203 0 0 203 / 243 29.7s Mpmcq parallel [ ] 204 0 0 204 / 243 30.0s Mpmcq parallel [ ] 206 0 0 206 / 243 30.2s Mpmcq parallel [ ] 207 0 0 207 / 243 30.3s Mpmcq parallel [ ] 208 0 0 208 / 243 30.7s Mpmcq parallel [ ] 209 0 0 209 / 243 30.9s Mpmcq parallel [ ] 210 0 0 210 / 243 31.3s Mpmcq parallel [ ] 212 0 0 212 / 243 31.6s Mpmcq parallel [ ] 213 0 0 213 / 243 31.8s Mpmcq parallel [ ] 214 0 0 214 / 243 32.0s Mpmcq parallel [ ] 216 0 0 216 / 243 32.1s Mpmcq parallel [ ] 217 0 0 217 / 243 32.5s Mpmcq parallel [ ] 219 0 0 219 / 243 32.6s Mpmcq parallel [ ] 222 0 0 222 / 243 32.9s Mpmcq parallel [ ] 224 0 0 224 / 243 33.0s Mpmcq parallel [ ] 225 0 0 225 / 243 34.0s Mpmcq parallel [ ] 226 0 0 226 / 243 34.2s Mpmcq parallel [ ] 228 0 0 228 / 243 34.4s Mpmcq parallel [ ] 229 0 0 229 / 243 34.5s Mpmcq parallel [ ] 230 0 0 230 / 243 35.1s Mpmcq parallel [ ] 231 0 0 231 / 243 36.5s Mpmcq parallel [ ] 232 0 0 232 / 243 36.8s Mpmcq parallel [ ] 233 0 0 233 / 243 37.1s Mpmcq parallel [ ] 237 0 0 237 / 243 37.2s Mpmcq parallel [ ] 238 0 0 238 / 243 37.3s Mpmcq parallel [ ] 239 0 0 239 / 243 37.7s Mpmcq parallel [ ] 240 0 0 240 / 243 38.0s Mpmcq parallel [ ] 241 0 0 241 / 243 38.4s Mpmcq parallel [ ] 243 0 0 243 / 243 38.5s Mpmcq parallel [✓] 243 0 0 243 / 243 38.5s Mpmcq parallel ================================================================================ success (ran 2 tests) (cd _build/default/test && ./test_mpscq.exe) random 
seed: 2485624444949140883 generated error fail pass / total time test name [ ] 0 0 0 0 / 32 0.0s Mpscq sequential [ ] 0 0 0 0 / 32 0.0s Mpscq sequential (generating) [✓] 32 0 0 32 / 32 0.0s Mpscq sequential [ ] 0 0 0 0 / 32 0.0s Mpscq parallel [ ] 1 0 0 1 / 32 0.2s Mpscq parallel [ ] 2 0 0 2 / 32 0.9s Mpscq parallel [ ] 3 0 0 3 / 32 1.0s Mpscq parallel [ ] 5 0 0 5 / 32 1.2s Mpscq parallel [ ] 6 0 0 6 / 32 1.6s Mpscq parallel [ ] 7 0 0 7 / 32 2.9s Mpscq parallel [ ] 8 0 0 8 / 32 3.5s Mpscq parallel [ ] 9 0 0 9 / 32 4.5s Mpscq parallel [ ] 10 0 0 10 / 32 5.6s Mpscq parallel [ ] 11 0 0 11 / 32 6.6s Mpscq parallel [ ] 12 0 0 12 / 32 8.0s Mpscq parallel [ ] 13 0 0 13 / 32 8.8s Mpscq parallel [ ] 14 0 0 14 / 32 9.2s Mpscq parallel [ ] 15 0 0 15 / 32 9.5s Mpscq parallel [ ] 16 0 0 16 / 32 10.2s Mpscq parallel [ ] 17 0 0 17 / 32 10.5s Mpscq parallel [ ] 18 0 0 18 / 32 11.3s Mpscq parallel [ ] 19 0 0 19 / 32 11.5s Mpscq parallel [ ] 20 0 0 20 / 32 11.7s Mpscq parallel [ ] 21 0 0 21 / 32 12.1s Mpscq parallel [ ] 22 0 0 22 / 32 12.5s Mpscq parallel [ ] 23 0 0 23 / 32 12.9s Mpscq parallel [ ] 24 0 0 24 / 32 13.1s Mpscq parallel [ ] 25 0 0 25 / 32 14.0s Mpscq parallel [ ] 26 0 0 26 / 32 14.3s Mpscq parallel [ ] 27 0 0 27 / 32 14.4s Mpscq parallel [ ] 28 0 0 28 / 32 14.7s Mpscq parallel [ ] 29 0 0 29 / 32 15.8s Mpscq parallel [ ] 30 0 0 30 / 32 16.1s Mpscq parallel [ ] 31 0 0 31 / 32 16.9s Mpscq parallel [ ] 32 0 0 32 / 32 17.1s Mpscq parallel [✓] 32 0 0 32 / 32 17.1s Mpscq parallel ================================================================================ success (ran 2 tests) random seed: 2525465284365432692 generated error fail pass / total time test name [ ] 0 0 0 0 / 64 0.0s Mpscq sequential [✓] 64 0 0 64 / 64 0.0s Mpscq sequential [ ] 0 0 0 0 / 64 0.0s Mpscq parallel [ ] 1 0 0 1 / 64 0.4s Mpscq parallel [ ] 2 0 0 2 / 64 0.5s Mpscq parallel [ ] 3 0 0 3 / 64 0.7s Mpscq parallel [ ] 4 0 0 4 / 64 1.1s Mpscq parallel [ ] 5 0 0 5 / 64 1.4s Mpscq parallel [ ] 6 0 0 6 / 64 
1.7s Mpscq parallel [ ] 7 0 0 7 / 64 1.9s Mpscq parallel [ ] 8 0 0 8 / 64 2.2s Mpscq parallel [ ] 9 0 0 9 / 64 2.9s Mpscq parallel [ ] 10 0 0 10 / 64 3.1s Mpscq parallel [ ] 11 0 0 11 / 64 3.3s Mpscq parallel [ ] 12 0 0 12 / 64 3.8s Mpscq parallel [ ] 13 0 0 13 / 64 4.0s Mpscq parallel [ ] 14 0 0 14 / 64 4.1s Mpscq parallel [ ] 15 0 0 15 / 64 4.3s Mpscq parallel [ ] 16 0 0 16 / 64 4.5s Mpscq parallel [ ] 17 0 0 17 / 64 4.6s Mpscq parallel [ ] 18 0 0 18 / 64 4.9s Mpscq parallel [ ] 19 0 0 19 / 64 5.6s Mpscq parallel [ ] 20 0 0 20 / 64 6.9s Mpscq parallel [ ] 21 0 0 21 / 64 7.3s Mpscq parallel [ ] 22 0 0 22 / 64 7.6s Mpscq parallel [ ] 23 0 0 23 / 64 7.8s Mpscq parallel [ ] 24 0 0 24 / 64 8.1s Mpscq parallel [ ] 25 0 0 25 / 64 8.5s Mpscq parallel [ ] 26 0 0 26 / 64 9.0s Mpscq parallel [ ] 27 0 0 27 / 64 9.3s Mpscq parallel [ ] 28 0 0 28 / 64 9.8s Mpscq parallel [ ] 29 0 0 29 / 64 10.3s Mpscq parallel [ ] 30 0 0 30 / 64 10.5s Mpscq parallel [ ] 31 0 0 31 / 64 10.9s Mpscq parallel [ ] 32 0 0 32 / 64 11.5s Mpscq parallel [ ] 33 0 0 33 / 64 11.9s Mpscq parallel [ ] 34 0 0 34 / 64 12.3s Mpscq parallel [ ] 35 0 0 35 / 64 13.3s Mpscq parallel [ ] 36 0 0 36 / 64 13.8s Mpscq parallel [ ] 37 0 0 37 / 64 14.3s Mpscq parallel [ ] 39 0 0 39 / 64 14.7s Mpscq parallel [ ] 40 0 0 40 / 64 15.2s Mpscq parallel [ ] 41 0 0 41 / 64 15.6s Mpscq parallel [ ] 42 0 0 42 / 64 15.8s Mpscq parallel [ ] 43 0 0 43 / 64 16.0s Mpscq parallel [ ] 44 0 0 44 / 64 16.4s Mpscq parallel [ ] 45 0 0 45 / 64 16.6s Mpscq parallel [ ] 46 0 0 46 / 64 17.5s Mpscq parallel [ ] 47 0 0 47 / 64 17.8s Mpscq parallel [ ] 48 0 0 48 / 64 18.0s Mpscq parallel [ ] 49 0 0 49 / 64 18.3s Mpscq parallel [ ] 50 0 0 50 / 64 19.0s Mpscq parallel [ ] 51 0 0 51 / 64 19.5s Mpscq parallel [ ] 52 0 0 52 / 64 19.9s Mpscq parallel [ ] 53 0 0 53 / 64 20.5s Mpscq parallel [ ] 54 0 0 54 / 64 21.1s Mpscq parallel [ ] 55 0 0 55 / 64 21.9s Mpscq parallel [ ] 56 0 0 56 / 64 22.3s Mpscq parallel [ ] 57 0 0 57 / 64 22.6s Mpscq parallel [ ] 58 
0 0 58 / 64 23.0s Mpscq parallel [ ] 59 0 0 59 / 64 23.3s Mpscq parallel [ ] 60 0 0 60 / 64 23.6s Mpscq parallel [ ] 61 0 0 61 / 64 23.9s Mpscq parallel [ ] 62 0 0 62 / 64 24.6s Mpscq parallel [ ] 63 0 0 63 / 64 25.4s Mpscq parallel [ ] 64 0 0 64 / 64 26.2s Mpscq parallel [✓] 64 0 0 64 / 64 26.2s Mpscq parallel ================================================================================ success (ran 2 tests) random seed: 1004892259005584219 generated error fail pass / total time test name [ ] 0 0 0 0 / 37 0.0s Mpscq sequential [✓] 37 0 0 37 / 37 0.0s Mpscq sequential [ ] 0 0 0 0 / 37 0.0s Mpscq parallel [ ] 1 0 0 1 / 37 0.6s Mpscq parallel [ ] 2 0 0 2 / 37 1.6s Mpscq parallel [ ] 3 0 0 3 / 37 1.8s Mpscq parallel [ ] 4 0 0 4 / 37 2.5s Mpscq parallel [ ] 5 0 0 5 / 37 3.4s Mpscq parallel [ ] 6 0 0 6 / 37 3.7s Mpscq parallel [ ] 7 0 0 7 / 37 3.8s Mpscq parallel [ ] 8 0 0 8 / 37 4.0s Mpscq parallel [ ] 9 0 0 9 / 37 4.1s Mpscq parallel [ ] 11 0 0 11 / 37 4.6s Mpscq parallel [ ] 12 0 0 12 / 37 5.3s Mpscq parallel [ ] 13 0 0 13 / 37 5.7s Mpscq parallel [ ] 14 0 0 14 / 37 5.9s Mpscq parallel [ ] 15 0 0 15 / 37 6.1s Mpscq parallel [ ] 16 0 0 16 / 37 6.4s Mpscq parallel [ ] 17 0 0 17 / 37 6.5s Mpscq parallel [ ] 18 0 0 18 / 37 6.8s Mpscq parallel [ ] 19 0 0 19 / 37 6.9s Mpscq parallel [ ] 20 0 0 20 / 37 7.4s Mpscq parallel [ ] 21 0 0 21 / 37 8.1s Mpscq parallel [ ] 22 0 0 22 / 37 8.3s Mpscq parallel [ ] 23 0 0 23 / 37 8.5s Mpscq parallel [ ] 24 0 0 24 / 37 9.0s Mpscq parallel [ ] 25 0 0 25 / 37 9.1s Mpscq parallel [ ] 26 0 0 26 / 37 9.2s Mpscq parallel [ ] 27 0 0 27 / 37 9.5s Mpscq parallel [ ] 28 0 0 28 / 37 9.8s Mpscq parallel [ ] 29 0 0 29 / 37 10.0s Mpscq parallel [ ] 30 0 0 30 / 37 10.5s Mpscq parallel [ ] 32 0 0 32 / 37 10.8s Mpscq parallel [ ] 33 0 0 33 / 37 11.0s Mpscq parallel [ ] 34 0 0 34 / 37 11.2s Mpscq parallel [ ] 35 0 0 35 / 37 11.5s Mpscq parallel [ ] 37 0 0 37 / 37 13.3s Mpscq parallel [✓] 37 0 0 37 / 37 13.3s Mpscq parallel 
================================================================================ success (ran 2 tests) random seed: 4413087501731472483 generated error fail pass / total time test name [ ] 0 0 0 0 / 32 0.0s Mpscq sequential [✓] 32 0 0 32 / 32 0.0s Mpscq sequential [ ] 0 0 0 0 / 32 0.0s Mpscq parallel [ ] 1 0 0 1 / 32 0.3s Mpscq parallel [ ] 2 0 0 2 / 32 0.4s Mpscq parallel [ ] 3 0 0 3 / 32 0.7s Mpscq parallel [ ] 4 0 0 4 / 32 1.0s Mpscq parallel [ ] 5 0 0 5 / 32 1.5s Mpscq parallel [ ] 6 0 0 6 / 32 2.2s Mpscq parallel [ ] 7 0 0 7 / 32 2.4s Mpscq parallel [ ] 8 0 0 8 / 32 2.5s Mpscq parallel [ ] 9 0 0 9 / 32 2.9s Mpscq parallel [ ] 10 0 0 10 / 32 3.4s Mpscq parallel [ ] 11 0 0 11 / 32 3.7s Mpscq parallel [ ] 12 0 0 12 / 32 4.0s Mpscq parallel [ ] 13 0 0 13 / 32 4.2s Mpscq parallel [ ] 14 0 0 14 / 32 4.6s Mpscq parallel [ ] 15 0 0 15 / 32 4.8s Mpscq parallel [ ] 16 0 0 16 / 32 5.1s Mpscq parallel [ ] 17 0 0 17 / 32 5.3s Mpscq parallel [ ] 18 0 0 18 / 32 6.2s Mpscq parallel [ ] 20 0 0 20 / 32 6.5s Mpscq parallel [ ] 21 0 0 21 / 32 6.9s Mpscq parallel [ ] 22 0 0 22 / 32 8.7s Mpscq parallel [ ] 23 0 0 23 / 32 9.3s Mpscq parallel [ ] 24 0 0 24 / 32 10.0s Mpscq parallel [ ] 25 0 0 25 / 32 10.6s Mpscq parallel [ ] 26 0 0 26 / 32 11.0s Mpscq parallel [ ] 27 0 0 27 / 32 12.6s Mpscq parallel [ ] 28 0 0 28 / 32 13.6s Mpscq parallel [ ] 29 0 0 29 / 32 14.2s Mpscq parallel [ ] 31 0 0 31 / 32 17.7s Mpscq parallel [ ] 32 0 0 32 / 32 18.1s Mpscq parallel [✓] 32 0 0 32 / 32 18.1s Mpscq parallel ================================================================================ success (ran 2 tests) (cd _build/default/test && ./test_lwt_unix.exe) Testing `Picos_lwt'. This run has ID `XIZP29HW'. [OK] Basics 0 Full test results in `~/src/_build/default/test/_build/_tests/Picos_lwt'. Test Successful in 0.056s. 1 test run. 
(cd _build/default/test && ./test_picos_lwt_unix_with_cohttp.exe)
Uri: //127.0.0.1:8000/hello-lwt
Method: GET
host: 127.0.0.1:8000
user-agent: ocaml-cohttp/v6.0.0
content-length: 0
Body:

(cd _build/default/test && ./test_server_and_client.exe)
Using blocking sockets and fibers on OCaml 5:
Looping server running
Client B running
Client A running
Server listening
Server accepting
Server accepted client
Server accepting
Client A connected
Client A wrote 100
Server read 100
Client A read 50
Server wrote 50
Client B connected
Client B wrote 100
Server accepted client
Server read 100
Server wrote 50
Server accepting
Client B read 50
Server and Client test: OK

(cd _build/default/test && ./test_io.exe)
Testing `Picos_io'. This run has ID `MYX6MQLF'.
[OK] Unix 0 openfile and read.
[OK] Unix 1 sleepf.
[OK] Unix 2 select empty timeout.
[OK] Unix 3 select empty ∞.
[OK] Unix 4 select.
[OK] Unix 5 system.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_io'.
Test Successful in 0.501s. 6 tests run.

(cd _build/default/test && ./test_io_with_lwt.exe)
Testing `Picos_io_with_lwt'. This run has ID `APDHYXLE'.
[OK] Unix 0 system.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_io_with_lwt'.
Test Successful in 2.038s. 1 test run.

(cd _build/default/test && ./test_select.exe)
Testing `Picos_select'. This run has ID `BGFFJU6Y'.
[OK] Intr 0
Full test results in `~/src/_build/default/test/_build/_tests/Picos_select'.
Test Successful in 7.219s. 1 test run.

(cd _build/default/test && ./test_finally.exe)
Testing `Picos_finally'. This run has ID `A2DJ6X6Z'.
[OK] move 0 is lazy.
[OK] borrow 0 returns resource.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_finally'.
Test Successful in 0.003s. 2 tests run.

(cd _build/default/test && ./test_structured.exe)
Testing `Picos_structured'. This run has ID `BP1PLTDA'.
[OK] Bundle 0 fork after terminate.
[OK] Bundle 1 fork after escape.
[OK] Bundle 2 exception in child terminates.
[OK] Bundle 3 cancelation awaits children.
[OK] Bundle 4 block raises when forbidden.
[OK] Bundle 5 block raises Sys_error when fiber finishes.
[OK] Bundle 6 termination nests.
[OK] Bundle 7 promise cancelation does not terminate.
[OK] Bundle 8 error in promise terminates.
[OK] Bundle 9 can wait promises.
[OK] Bundle 10 can select promises.
[OK] Run 0 any and all errors.
[OK] Run 1 any and all returns.
[OK] Run 2 race any.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_structured'.
Test Successful in 0.831s. 14 tests run.

(cd _build/default/test && ./test_sync.exe -- Event 0)
Testing `Picos_sync'. This run has ID `Q79U6UI4'.
[SKIP] Mutex and Condition 0 basics.
[SKIP] Mutex and Condition 1 errors.
[SKIP] Mutex and Condition 2 cancelation.
[SKIP] Semaphore 0 basics.
[SKIP] Semaphore 1 stress.
[SKIP] Lazy 0 basics.
[SKIP] Lazy 1 cancelation.
[OK] Event 0 basics.
[SKIP] Non-cancelable ops 0 are not canceled.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_sync'.
Test Successful in 0.109s. 1 test run.

(cd _build/default/test && ./test_io_cohttp.exe)
Uri: //127.0.0.1:29269/hello-io-cohttp
Method: POST
host: 127.0.0.1:29269
user-agent: ocaml-cohttp/v6.0.0
content-length: 17
Body: It's-a-Me, Picos!

(cd _build/default/bench && ./main.exe -brief 'Picos Computation')
Picos Computation:
attach detach pairs over time/1 worker: 5.86 M/s
attach detach pairs over time/2 workers: 3.81 M/s
attach detach pairs over time/4 workers: 2.48 M/s
attach detach pairs over time/trivial: 13.46 M/s
time per attach detach pair/1 worker: 170.66 ns
time per attach detach pair/2 workers: 524.47 ns
time per attach detach pair/4 workers: 1615.38 ns
time per attach detach pair/trivial: 74.31 ns

(cd _build/default/test && ./test_sync.exe -- Lazy 0)
Testing `Picos_sync'. This run has ID `X6YO8J3E'.
[SKIP] Mutex and Condition 0 basics.
[SKIP] Mutex and Condition 1 errors.
[SKIP] Mutex and Condition 2 cancelation.
[SKIP] Semaphore 0 basics.
[SKIP] Semaphore 1 stress.
[OK] Lazy 0 basics.
[SKIP] Lazy 1 cancelation.
[SKIP] Event 0 basics.
[SKIP] Non-cancelable ops 0 are not canceled.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_sync'.
Test Successful in 0.003s. 1 test run.

(cd _build/default/bench && ./main.exe -brief 'Picos Current')
Picos Current:
ops over time/1 worker: 22.22 M/s
ops over time/2 workers: 43.80 M/s
ops over time/4 workers: 26.83 M/s
time per op/1 worker: 45.01 ns
time per op/2 workers: 45.66 ns
time per op/4 workers: 149.07 ns

(cd _build/default/test && ./test_sync.exe -- Lazy 1)
Testing `Picos_sync'. This run has ID `UEF43W11'.
[SKIP] Mutex and Condition 0 basics.
[SKIP] Mutex and Condition 1 errors.
[SKIP] Mutex and Condition 2 cancelation.
[SKIP] Semaphore 0 basics.
[SKIP] Semaphore 1 stress.
[SKIP] Lazy 0 basics.
[OK] Lazy 1 cancelation.
[SKIP] Event 0 basics.
[SKIP] Non-cancelable ops 0 are not canceled.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_sync'.
Test Successful in 0.003s. 1 test run.

(cd _build/default/bench && ./main.exe -brief 'Picos FLS (excluding Current)')
Picos FLS (excluding Current):
gets over time/1 worker: 140.53 M/s
gets over time/2 workers: 259.15 M/s
gets over time/4 workers: 95.21 M/s
sets over time/1 worker: 69.84 M/s
sets over time/2 workers: 123.32 M/s
sets over time/4 workers: 29.44 M/s
time per get/1 worker: 7.12 ns
time per get/2 workers: 7.72 ns
time per get/4 workers: 42.01 ns
time per set/1 worker: 14.32 ns
time per set/2 workers: 16.22 ns
time per set/4 workers: 135.89 ns

(cd _build/default/test && ./test_sync.exe -- Semaphore 0)
Testing `Picos_sync'. This run has ID `7187PEYF'.
[SKIP] Mutex and Condition 0 basics.
[SKIP] Mutex and Condition 1 errors.
[SKIP] Mutex and Condition 2 cancelation.
[OK] Semaphore 0 basics.
[SKIP] Semaphore 1 stress.
[SKIP] Lazy 0 basics.
[SKIP] Lazy 1 cancelation.
[SKIP] Event 0 basics.
[SKIP] Non-cancelable ops 0 are not canceled.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_sync'.
Test Successful in 0.003s. 1 test run.

(cd _build/default/bench && ./main.exe -brief 'Picos TLS')
Picos TLS:
gets over time/1 worker: 21.68 M/s
gets over time/2 workers: 47.24 M/s
gets over time/4 workers: 28.38 M/s
sets over time/1 worker: 16.03 M/s
sets over time/2 workers: 36.51 M/s
sets over time/4 workers: 18.65 M/s
time per get/1 worker: 46.13 ns
time per get/2 workers: 42.34 ns
time per get/4 workers: 140.96 ns
time per set/1 worker: 62.40 ns
time per set/2 workers: 54.77 ns
time per set/4 workers: 214.50 ns

(cd _build/default/test && ./test_sync.exe -- Semaphore 1)
Testing `Picos_sync'. This run has ID `Q2WWX1UB'.
[SKIP] Mutex and Condition 0 basics.
[SKIP] Mutex and Condition 1 errors.
[SKIP] Mutex and Condition 2 cancelation.
[SKIP] Semaphore 0 basics.
[OK] Semaphore 1 stress.
[SKIP] Lazy 0 basics.
[SKIP] Lazy 1 cancelation.
[SKIP] Event 0 basics.
[SKIP] Non-cancelable ops 0 are not canceled.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_sync'.
Test Successful in 0.091s. 1 test run.

(cd _build/default/bench && ./main.exe -brief 'Picos DLS')
Picos DLS:
gets over time/1 worker: 78.33 M/s
gets over time/2 workers: 156.61 M/s
gets over time/4 workers: 54.97 M/s
sets over time/1 worker: 48.43 M/s
sets over time/2 workers: 94.27 M/s
sets over time/4 workers: 31.21 M/s
time per get/1 worker: 12.77 ns
time per get/2 workers: 12.77 ns
time per get/4 workers: 72.77 ns
time per set/1 worker: 20.65 ns
time per set/2 workers: 21.21 ns
time per set/4 workers: 128.16 ns

(cd _build/default/test && ./test_sync.exe -- 'Non-cancelable ops' 0)
Testing `Picos_sync'. This run has ID `FKNJ7FHT'.
[SKIP] Mutex and Condition 0 basics.
[SKIP] Mutex and Condition 1 errors.
[SKIP] Mutex and Condition 2 cancelation.
[SKIP] Semaphore 0 basics.
[SKIP] Semaphore 1 stress.
[SKIP] Lazy 0 basics.
[SKIP] Lazy 1 cancelation.
[SKIP] Event 0 basics.
[OK] Non-cancelable ops 0 are not canceled.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_sync'.
Test Successful in 0.003s. 1 test run.

(cd _build/default/test && ./test_schedulers.exe)
Testing `Picos schedulers'. This run has ID `Z69HG790'.
[OK] Trivial main returns 0
[OK] Scheduler completes main computation 0
[OK] Current 0
[OK] Cancel_after 0 basic.
[OK] Cancel_after 1 long timeout.
[OK] Operation on canceled fiber raises 0
[OK] Cross scheduler wakeup 0
[OK] Fatal exception terminates scheduler 0
Full test results in `~/src/_build/default/test/_build/_tests/Picos schedulers'.
Test Successful in 50.814s. 8 tests run.

(cd _build/default/test && ./test_sync.exe -- 'Mutex and Condition' 0)
Testing `Picos_sync'. This run has ID `3QUGYPNV'.
[OK] Mutex and Condition 0 basics.
[SKIP] Mutex and Condition 1 errors.
[SKIP] Mutex and Condition 2 cancelation.
[SKIP] Semaphore 0 basics.
[SKIP] Semaphore 1 stress.
[SKIP] Lazy 0 basics.
[SKIP] Lazy 1 cancelation.
[SKIP] Event 0 basics.
[SKIP] Non-cancelable ops 0 are not canceled.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_sync'.
Test Successful in 0.004s. 1 test run.

(cd _build/default/test && ./test_sync.exe -- 'Mutex and Condition' 1)
Testing `Picos_sync'. This run has ID `SIFFCA1N'.
[SKIP] Mutex and Condition 0 basics.
[OK] Mutex and Condition 1 errors.
[SKIP] Mutex and Condition 2 cancelation.
[SKIP] Semaphore 0 basics.
[SKIP] Semaphore 1 stress.
[SKIP] Lazy 0 basics.
[SKIP] Lazy 1 cancelation.
[SKIP] Event 0 basics.
[SKIP] Non-cancelable ops 0 are not canceled.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_sync'.
Test Successful in 0.003s. 1 test run.

(cd _build/default/test && ./test_sync.exe -- 'Mutex and Condition' 2)
Testing `Picos_sync'. This run has ID `XEDH71RZ'.
[SKIP] Mutex and Condition 0 basics.
[SKIP] Mutex and Condition 1 errors.
[OK] Mutex and Condition 2 cancelation.
[SKIP] Semaphore 0 basics.
[SKIP] Semaphore 1 stress.
[SKIP] Lazy 0 basics.
[SKIP] Lazy 1 cancelation.
[SKIP] Event 0 basics.
[SKIP] Non-cancelable ops 0 are not canceled.
Full test results in `~/src/_build/default/test/_build/_tests/Picos_sync'.
Test Successful in 0.532s. 1 test run.
(cd _build/default/test && ./test_picos.exe)
Testing `Picos'. This run has ID `XVVP28JX'.
[OK] Trigger 0 basics.
[OK] Computation 0 basics.
[OK] Computation 1 tx.
[OK] Computation 2 signals in order.
[OK] Fiber.FLS 0 basics.
[OK] Cancel 0
[OK] Cancel after 0
Full test results in `~/src/_build/default/test/_build/_tests/Picos'.
Test Successful in 53.833s. 7 tests run.
(cd _build/default/test && ./test_sync_queue.exe)
random seed: 3358620915296923426
generated error fail pass / total time test name
[✓] 32 0 0 32 / 32 0.0s Picos_std_sync.Queue sequential
[✓] 32 0 0 32 / 32 1.1s Picos_std_sync.Queue parallel
================================================================================
success (ran 2 tests)
random seed: 2726963864700410432
generated error fail pass / total time test name
[✓] 64 0 0 64 / 64 0.0s Picos_std_sync.Queue sequential
[✓] 64 0 0 64 / 64 2.8s Picos_std_sync.Queue parallel
================================================================================
success (ran 2 tests)
random seed: 4401384767248963015
generated error fail pass / total time test name
[✓] 128 0 0 128 / 128 0.0s Picos_std_sync.Queue sequential
[✓] 128 0 0 128 / 128 36.2s Picos_std_sync.Queue parallel
================================================================================
success (ran 2 tests)
random seed: 2962210146906259439
generated error fail pass / total time test name
[✓] 112 0 0 112 / 112 0.0s Picos_std_sync.Queue sequential
[✓] 112 0 0 112 / 112 23.1s Picos_std_sync.Queue parallel
================================================================================
success (ran 2 tests)
(cd _build/default/bench && ./main.exe -brief 'Picos Mutex')
Picos Mutex:
locked yields over time/1024 fibers: 0.18 M/s
locked yields over time/1 fiber: 1.93 M/s
locked yields over time/256 fibers: 0.14 M/s
locked yields over time/2 domains: 0.07 M/s
locked yields over time/2 fibers: 1.04 M/s
locked yields over time/4 domains: 0.01 M/s
locked yields over time/4 fibers: 0.74 M/s
locked yields over time/512 fibers: 0.17 M/s
locked yields over time/8 fibers: 0.22 M/s
time per locked yield/1024 fibers: 5515.91 ns
time per locked yield/1 fiber: 517.77 ns
time per locked yield/256 fibers: 6951.73 ns
time per locked yield/2 domains: 28628.53 ns
time per locked yield/2 fibers: 958.11 ns
time per locked yield/4 domains: 378195.54 ns
time per locked yield/4 fibers: 1359.63 ns
time per locked yield/512 fibers: 5770.33 ns
time per locked yield/8 fibers: 4629.61 ns
(cd _build/default/bench && ./main.exe -brief 'Picos Semaphore')
Picos Semaphore:
acquired yields over time/4 domains, 1 resource: 0.48 M/s
acquired yields over time/4 domains, 2 resources: 1.12 M/s
acquired yields over time/4 domains, 3 resources: 3.22 M/s
acquired yields over time/4 domains, 4 resources: 4.69 M/s
acquired yields over time/4 fibers, 1 resource: 0.62 M/s
acquired yields over time/4 fibers, 2 resources: 0.64 M/s
acquired yields over time/4 fibers, 3 resources: 0.91 M/s
acquired yields over time/4 fibers, 4 resources: 1.56 M/s
time per acquired yield/4 domains, 1 resource: 8374.33 ns
time per acquired yield/4 domains, 2 resources: 3578.15 ns
time per acquired yield/4 domains, 3 resources: 1243.37 ns
time per acquired yield/4 domains, 4 resources: 853.18 ns
time per acquired yield/4 fibers, 1 resource: 1603.60 ns
time per acquired yield/4 fibers, 2 resources: 1573.54 ns
time per acquired yield/4 fibers, 3 resources: 1100.36 ns
time per acquired yield/4 fibers, 4 resources: 641.21 ns
(cd _build/default/bench && ./main.exe -brief 'Picos Spawn')
Picos Spawn:
spawns over time/with packed computation: 3.66 M/s
time per spawn/with packed computation: 273.30 ns
(cd _build/default/bench && ./main.exe -brief 'Picos Queue')
Picos Queue:
messages over time/1 nb adder, 1 nb taker: 27.43 M/s
messages over time/1 nb adder, 2 nb takers: 10.38 M/s
messages over time/2 nb adders, 1 nb taker: 8.12 M/s
messages over time/2 nb adders, 2 nb takers: 10.49 M/s
messages over time/one domain: 13.68 M/s
time per message/1 nb adder, 1 nb taker: 72.92 ns
time per message/1 nb adder, 2 nb takers: 288.93 ns
time per message/2 nb adders, 1 nb taker: 369.34 ns
time per message/2 nb adders, 2 nb takers: 381.40 ns
time per message/one domain: 73.12 ns
(cd _build/default/bench && ./main.exe -brief 'Picos Yield')
Picos Yield:
time per yield/10000 fibers: 816.68 ns
time per yield/1000 fibers: 197.21 ns
time per yield/100 fibers: 141.44 ns
time per yield/10 fibers: 221.51 ns
time per yield/1 fiber: 245.58 ns
yields over time/10000 fibers: 1.22 M/s
yields over time/1000 fibers: 5.07 M/s
yields over time/100 fibers: 7.07 M/s
yields over time/10 fibers: 4.51 M/s
yields over time/1 fiber: 4.07 M/s
(cd _build/default/bench && ./main.exe -brief 'Picos Cancel_after with Picos_select')
Picos Cancel_after with Picos_select:
async round-trips over time/1 worker: 0.16 M/s
async round-trips over time/2 workers: 0.31 M/s
async round-trips over time/4 workers: 0.63 M/s
round-trips over time/1 worker: 0.03 M/s
round-trips over time/2 workers: 0.04 M/s
round-trips over time/4 workers: 0.08 M/s
time per async round-trip/1 worker: 6286.39 ns
time per async round-trip/2 workers: 6457.92 ns
time per async round-trip/4 workers: 6356.21 ns
time per round-trip/1 worker: 30735.33 ns
time per round-trip/2 workers: 47514.55 ns
time per round-trip/4 workers: 51842.40 ns
(cd _build/default/bench && ./main.exe -brief 'Ref with Picos_sync.Mutex')
Ref with Picos_sync.Mutex:
ops over time/cas int (checked): 6.96 M/s
ops over time/cas int (unchecked): 22.81 M/s
ops over time/get (checked): 7.43 M/s
ops over time/get (unchecked): 20.99 M/s
ops over time/incr (checked): 4.62 M/s
ops over time/incr (unchecked): 20.31 M/s
ops over time/push & pop (checked): 6.40 M/s
ops over time/push & pop (unchecked): 17.74 M/s
ops over time/swap (checked): 5.38 M/s
ops over time/swap (unchecked): 22.18 M/s
ops over time/xchg int (checked): 6.25 M/s
ops over time/xchg int (unchecked): 23.32 M/s
time per op/cas int (checked): 143.78 ns
time per op/cas int (unchecked): 43.83 ns
time per op/get (checked): 134.57 ns
time per op/get (unchecked): 47.64 ns
time per op/incr (checked): 216.65 ns
time per op/incr (unchecked): 49.25 ns
time per op/push & pop (checked): 156.23 ns
time per op/push & pop (unchecked): 56.37 ns
time per op/swap (checked): 185.89 ns
time per op/swap (unchecked): 45.08 ns
time per op/xchg int (checked): 159.99 ns
time per op/xchg int (unchecked): 42.89 ns
(cd _build/default/bench && ./main.exe -brief Picos_mpmcq)
Picos_mpmcq:
messages over time/1 nb adder, 1 nb taker: 6.16 M/s
messages over time/1 nb adder, 2 nb takers: 6.21 M/s
messages over time/2 nb adders, 1 nb taker: 6.46 M/s
messages over time/2 nb adders, 2 nb takers: 6.35 M/s
messages over time/one domain: 9.58 M/s
time per message/1 nb adder, 1 nb taker: 324.53 ns
time per message/1 nb adder, 2 nb takers: 483.43 ns
time per message/2 nb adders, 1 nb taker: 464.42 ns
time per message/2 nb adders, 2 nb takers: 629.53 ns
time per message/one domain: 104.33 ns
(cd _build/default/bench && ./main.exe -brief Picos_mpscq)
Picos_mpscq:
messages over time/1 nb adder, 1 nb taker: 18.11 M/s
messages over time/2 nb adders, 1 nb taker: 11.27 M/s
messages over time/one domain: 8.67 M/s
time per message/1 nb adder, 1 nb taker: 110.45 ns
time per message/2 nb adders, 1 nb taker: 266.19 ns
time per message/one domain: 115.28 ns
(cd _build/default/bench && ./main.exe -brief Picos_htbl)
Picos_htbl:
operations over time/1 worker, 10% reads: 12.84 M/s
operations over time/1 worker, 50% reads: 15.67 M/s
operations over time/1 worker, 90% reads: 20.57 M/s
operations over time/2 workers, 10% reads: 15.94 M/s
operations over time/2 workers, 50% reads: 22.74 M/s
operations over time/2 workers, 90% reads: 33.96 M/s
operations over time/4 workers, 10% reads: 28.78 M/s
operations over time/4 workers, 50% reads: 33.43 M/s
operations over time/4 workers, 90% reads: 54.44 M/s
time per operation/1 worker, 10% reads: 77.87 ns
time per operation/1 worker, 50% reads: 63.83 ns
time per operation/1 worker, 90% reads: 48.62 ns
time per operation/2 workers, 10% reads: 125.50 ns
time per operation/2 workers, 50% reads: 87.94 ns
time per operation/2 workers, 90% reads: 58.89 ns
time per operation/4 workers, 10% reads: 138.99 ns
time per operation/4 workers, 50% reads: 119.67 ns
time per operation/4 workers, 90% reads: 73.47 ns
(cd _build/default/bench && ./main.exe -brief Picos_stdio)
Picos_stdio:
blocking reads over time/1 worker: 0.02 M/s
blocking reads over time/2 workers: 0.04 M/s
blocking reads over time/4 workers: 0.17 M/s
non-blocking reads over time/1 worker: 0.23 M/s
non-blocking reads over time/2 workers: 0.42 M/s
non-blocking reads over time/4 workers: 1.00 M/s
time per blocking read/1 worker: 45262.43 ns
time per blocking read/2 workers: 55005.87 ns
time per blocking read/4 workers: 23554.96 ns
time per non-blocking read/1 worker: 4257.39 ns
time per non-blocking read/2 workers: 4759.15 ns
time per non-blocking read/4 workers: 4009.32 ns
(cd _build/default/bench && ./main.exe -brief 'Picos_sync Stream')
Picos_sync Stream:
messages over time/1 nb pusher, 1 nb reader: 4.16 M/s
messages over time/2 nb pushers, 1 nb reader: 3.48 M/s
messages over time/one domain: 3.99 M/s
time per message/1 nb pusher, 1 nb reader: 480.79 ns
time per message/2 nb pushers, 1 nb reader: 862.40 ns
time per message/one domain: 250.54 ns
(cd _build/default/bench && ./main.exe -brief Fib)
Fib:
spawns over time/1 mfifo, fib 20: 0.29 M/s
spawns over time/1 rando, fib 20: 0.08 M/s
spawns over time/2 mfifos, fib 20: 0.29 M/s
spawns over time/2 randos, fib 20: 0.10 M/s
spawns over time/4 mfifos, fib 20: 0.38 M/s
spawns over time/4 randos, fib 20: 0.09 M/s
time per spawn/1 mfifo, fib 20: 3441.43 ns
time per spawn/1 rando, fib 20: 12800.64 ns
time per spawn/2 mfifos, fib 20: 6788.52 ns
time per spawn/2 randos, fib 20: 20738.74 ns
time per spawn/4 mfifos, fib 20: 10420.76 ns
time per spawn/4 randos, fib 20: 43649.58 ns
(cd _build/default/bench && ./main.exe -brief 'Picos binaries')
Picos binaries:
binary size/picos: 84.02 kB
binary size/picos.domain: 3.67 kB
binary size/picos.thread: 3.11 kB
binary size/picos_aux.htbl: 48.91 kB
binary size/picos_aux.mpmcq: 15.29 kB
binary size/picos_aux.mpscq: 17.79 kB
binary size/picos_aux.rc: 16.11 kB
binary size/picos_io: 107.63 kB
binary size/picos_io.fd: 9.04 kB
binary size/picos_io.select: 60.37 kB
binary size/picos_io_cohttp: 44.20 kB
binary size/picos_lwt: 25.12 kB
binary size/picos_lwt.unix: 13.81 kB
binary size/picos_mux.fifo: 25.21 kB
binary size/picos_mux.multififo: 58.04 kB
binary size/picos_mux.random: 46.78 kB
binary size/picos_mux.thread: 21.28 kB
binary size/picos_std.awaitable: 32.42 kB
binary size/picos_std.event: 21.49 kB
binary size/picos_std.finally: 18.12 kB
binary size/picos_std.structured: 77.01 kB
binary size/picos_std.sync: 125.21 kB
(cd _build/default/bench && ./main.exe -brief 'Bounded_q with Picos_sync')
Bounded_q with Picos_sync:
messages over time/1 adder, 1 taker: 0.26 M/s
messages over time/1 adder, 2 takers: 0.21 M/s
messages over time/2 adders, 1 taker: 0.38 M/s
messages over time/2 adders, 2 takers: 0.09 M/s
messages over time/one domain: 9.34 M/s
time per message/1 adder, 1 taker: 7605.44 ns
time per message/1 adder, 2 takers: 14513.64 ns
time per message/2 adders, 1 taker: 7872.95 ns
time per message/2 adders, 2 takers: 44373.02 ns
time per message/one domain: 107.12 ns
(cd _build/default/bench && ./main.exe -brief 'Memory usage')
Memory usage:
stack and heap used/Fun.protect: 80.00 B
stack and heap used/fiber in a bundle: 232.00 B
stack and heap used/fiber in a flock: 248.00 B
stack and heap used/fiber with shared computation & latch: 232.00 B
stack and heap used/finally: 40.00 B
stack and heap used/instantiate: 96.00 B
stack and heap used/join_after bundle: 280.00 B
stack and heap used/join_after flock: 280.00 B
stack and heap used/lastly: 32.00 B
stack and heap used/promise in a bundle: 352.00 B
stack and heap used/promise in a flock: 368.00 B
2025-02-06 13:50.32 ---> saved as "be155dbecadc883c7606ce9226dbabfbd74771c061e40d1df5c52d99d9bf5b3a"
Job succeeded
2025-02-06 13:50.32: Job succeeded