2025-08-04 10:20.31: New job: test ocaml-multicore/picos https://github.com/ocaml-multicore/picos.git#refs/heads/main (cd68168145df9b3b78eb1db6ecaa2e6634264c05) (windows-amd64:windows-server-2022-amd64-5.3_opam-2.4)
Base: windows-server-2022-amd64-ocaml-5.3
Opam project build

To reproduce locally:

git clone --recursive "https://github.com/ocaml-multicore/picos.git" -b "main" && cd "picos" && git reset --hard cd681681
cat > Dockerfile <<'END-OF-DOCKERFILE'
FROM windows-server-2022-amd64-ocaml-5.3
# windows-server-2022-amd64-5.3_opam-2.4
USER 1000:1000
ENV CLICOLOR_FORCE="1"
ENV OPAMCOLOR="always"
RUN ln -f /usr/bin/opam-2.4 /usr/bin/opam
RUN opam init --reinit -ni
RUN uname -rs && opam exec -- ocaml -version && opam --version
RUN cd ~/opam-repository && (git cat-file -e 6129d098e5bdb58a83ef764c0af49db775a2987c || git fetch origin master) && git reset -q --hard 6129d098e5bdb58a83ef764c0af49db775a2987c && git log --no-decorate -n1 --oneline && opam update -u
COPY --chown=1000:1000 picos_std.opam picos_mux.opam picos_meta.opam picos_lwt.opam picos_io_cohttp.opam picos_io.opam picos_aux.opam picos.opam /Users/opam/src/./
RUN opam pin add -yn picos_std.dev '/Users/opam/src/./' && \
opam pin add -yn picos_mux.dev '/Users/opam/src/./' && \
opam pin add -yn picos_meta.dev '/Users/opam/src/./' && \
opam pin add -yn picos_lwt.dev '/Users/opam/src/./' && \
opam pin add -yn picos_io_cohttp.dev '/Users/opam/src/./' && \
opam pin add -yn picos_io.dev '/Users/opam/src/./' && \
opam pin add -yn picos_aux.dev '/Users/opam/src/./' && \
opam pin add -yn picos.dev '/Users/opam/src/./'
RUN echo '(lang dune 3.0)' > '/cygdrive/c/Users/opam/src/./dune-project'
ENV DEPS="alcotest.1.9.0 angstrom.0.16.1 arch-x86_64.1 astring.0.8.5 backoff.0.1.1 base.v0.17.3 base-bigarray.base base-bytes.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base base64.3.5.1 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 cohttp.6.1.1 cohttp-lwt.6.1.1 conf-mingw-w64-gcc-x86_64.1 containers.3.16 cppo.1.8.0 csexp.1.5.2 domain-local-await.1.0.1 domain-name.0.4.1 domain_shims.0.1.0 dscheck.0.5.0 dune.3.19.1 dune-configurator.3.19.1 either.1.0.0 flexdll.0.44 fmt.0.11.0 gen.1.1 host-arch-x86_64.1 host-system-mingw.1 http.6.1.1 ipaddr.5.6.1 js_of_ocaml.6.2.0 js_of_ocaml-compiler.6.2.0 logs.0.9.0 lwt.5.9.1 macaddr.5.6.1 mdx.2.5.0 menhir.20240715 menhirCST.20240715 menhirLib.20240715 menhirSdk.20240715 mingw-w64-shims.0.2.0 mtime.2.1.0 multicore-bench.0.1.7 multicore-magic.2.3.1 multicore-magic-dscheck.2.3.1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-env-mingw64.1 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.1 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 ocplib-endian.1.2 oseq.0.5.1 ppx_derivers.1.2.1 ppx_sexp_conv.v0.17.1 ppxlib.0.36.0 ppxlib_jane.v0.17.4 psq.0.2.1 qcheck-core.0.26 qcheck-multicoretests-util.0.9 qcheck-stm.0.9 re.1.13.2 result.1.5 sedlex.3.6 seq.base sexplib0.v0.17.0 stdlib-shims.0.3.0 stringext.1.6.0 system-mingw.1 thread-local-storage.0.2 thread-table.1.0.0 topkg.1.1.0 tsort.2.2.0 uri.4.4.0 uri-sexp.4.4.0 uutf.1.0.4 yojson.3.0.0"
ENV CI="true"
ENV OCAMLCI="true"
RUN opam update --depexts && opam install --cli=2.4 --depext-only -y picos_std.dev picos_mux.dev picos_meta.dev picos_lwt.dev picos_io_cohttp.dev picos_io.dev picos_aux.dev picos.dev $DEPS
RUN opam install $DEPS
COPY --chown=1000:1000 . /Users/opam/src
RUN cd /cygdrive/c/Users/opam/src && opam exec -- dune build @install @check @runtest && rm -rf _build
END-OF-DOCKERFILE
docker build .
END-REPRO-BLOCK
2025-08-04 10:20.31: Using cache hint "ocaml-multicore/picos-windows-server-2022-amd64-ocaml-5.3-windows-server-2022-amd64-5.3_opam-2.4-cb77d123ce3fef5f2a24878674255685"
2025-08-04 10:20.31: Using OBuilder spec:
((from windows-server-2022-amd64-ocaml-5.3)
 (comment windows-server-2022-amd64-5.3_opam-2.4)
 (user (uid 1000) (gid 1000))
 (env CLICOLOR_FORCE 1)
 (env OPAMCOLOR always)
 (run (shell "ln -f /usr/bin/opam-2.4 /usr/bin/opam"))
 (run (shell "opam init --reinit -ni"))
 (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
 (run (cache (opam-archives (target "c:\\Users\\opam\\AppData\\local\\opam\\download-cache"))) (network host)
      (shell "cd ~/opam-repository && (git cat-file -e 6129d098e5bdb58a83ef764c0af49db775a2987c || git fetch origin master) && git reset -q --hard 6129d098e5bdb58a83ef764c0af49db775a2987c && git log --no-decorate -n1 --oneline && opam update -u"))
 (copy (src picos_std.opam picos_mux.opam picos_meta.opam picos_lwt.opam picos_io_cohttp.opam picos_io.opam picos_aux.opam picos.opam) (dst /Users/opam/src/./))
 (run (network host)
      (shell "opam pin add -yn picos_std.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_mux.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_meta.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_lwt.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_io_cohttp.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_io.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_aux.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos.dev '/Users/opam/src/./'"))
 (run (network host) (shell "echo '(lang dune 3.0)' > '/cygdrive/c/Users/opam/src/./dune-project'"))
 (env DEPS "alcotest.1.9.0 angstrom.0.16.1 arch-x86_64.1 astring.0.8.5 backoff.0.1.1 base.v0.17.3 base-bigarray.base base-bytes.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base base64.3.5.1 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 cohttp.6.1.1 cohttp-lwt.6.1.1 conf-mingw-w64-gcc-x86_64.1 containers.3.16 cppo.1.8.0 csexp.1.5.2 domain-local-await.1.0.1 domain-name.0.4.1 domain_shims.0.1.0 dscheck.0.5.0 dune.3.19.1 dune-configurator.3.19.1 either.1.0.0 flexdll.0.44 fmt.0.11.0 gen.1.1 host-arch-x86_64.1 host-system-mingw.1 http.6.1.1 ipaddr.5.6.1 js_of_ocaml.6.2.0 js_of_ocaml-compiler.6.2.0 logs.0.9.0 lwt.5.9.1 macaddr.5.6.1 mdx.2.5.0 menhir.20240715 menhirCST.20240715 menhirLib.20240715 menhirSdk.20240715 mingw-w64-shims.0.2.0 mtime.2.1.0 multicore-bench.0.1.7 multicore-magic.2.3.1 multicore-magic-dscheck.2.3.1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-env-mingw64.1 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.1 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 ocplib-endian.1.2 oseq.0.5.1 ppx_derivers.1.2.1 ppx_sexp_conv.v0.17.1 ppxlib.0.36.0 ppxlib_jane.v0.17.4 psq.0.2.1 qcheck-core.0.26 qcheck-multicoretests-util.0.9 qcheck-stm.0.9 re.1.13.2 result.1.5 sedlex.3.6 seq.base sexplib0.v0.17.0 stdlib-shims.0.3.0 stringext.1.6.0 system-mingw.1 thread-local-storage.0.2 thread-table.1.0.0 topkg.1.1.0 tsort.2.2.0 uri.4.4.0 uri-sexp.4.4.0 uutf.1.0.4 yojson.3.0.0")
 (env CI true)
 (env OCAMLCI true)
 (run (cache (opam-archives (target "c:\\Users\\opam\\AppData\\local\\opam\\download-cache"))) (network host)
      (shell "opam update --depexts && opam install --cli=2.4 --depext-only -y picos_std.dev picos_mux.dev picos_meta.dev picos_lwt.dev picos_io_cohttp.dev picos_io.dev picos_aux.dev picos.dev $DEPS"))
 (run (cache (opam-archives (target "c:\\Users\\opam\\AppData\\local\\opam\\download-cache"))) (network host) (shell "opam install $DEPS"))
 (copy (src .) (dst /Users/opam/src))
 (run (shell "cd /cygdrive/c/Users/opam/src && opam exec -- dune build @install @check @runtest && rm -rf _build"))
)

2025-08-04 10:20.31: Waiting for resource in pool OCluster
2025-08-04 10:20.31: Waiting for worker…
2025-08-04 11:06.26: Got resource from pool OCluster
Building on thyme
All commits already cached
HEAD is now at cd68168 Inline operations in ref lock benchmark

(from windows-server-2022-amd64-ocaml-5.3)
2025-08-04 11:06.26 ---> using "1d6ffeb5a52124df73b2e951323e99615c52ca4f9bdf0f65039be6146fabdaab" from cache

/: (comment windows-server-2022-amd64-5.3_opam-2.4)

/: (user (uid 1000) (gid 1000))

/: (env CLICOLOR_FORCE 1)

/: (env OPAMCOLOR always)

/: (run (shell "ln -f /usr/bin/opam-2.4 /usr/bin/opam"))
2025-08-04 11:06.26 ---> using "1e2d14b3062f84a7d6f6f169887d274097a3ceefc9654f3a38ec7d8793bb5864" from cache

/: (run (shell "opam init --reinit -ni"))
No configuration file found, using built-in defaults.

<><> Unix support infrastructure ><><><><><><><><><><><><><><><><><><><><><><><>

opam and the OCaml ecosystem in general require various Unix tools in order to operate correctly. At present, this requires the installation of Cygwin to provide these tools.

How should opam obtain Unix tools?
> 1. Use tools found in PATH (Cygwin installation at C:\cygwin64)
  2. Automatically create an internal Cygwin installation that will be managed by opam (recommended)
  3. Use Cygwin installation found in C:\cygwin64
  4. Use another existing Cygwin/MSYS2 installation
  5. Abort initialisation
[1/2/3/4/5] 1

Checking for available remotes: rsync and local, git.
  - you won't be able to use mercurial repositories unless you install the hg command on your system.
  - you won't be able to use darcs repositories unless you install the darcs command on your system.

<><> Updating repositories ><><><><><><><><><><><><><><><><><><><><><><><><><><>
[default] no changes from file://C:/Users/opam/opam-repository
2025-08-04 11:06.26 ---> using "6dbdca3e6c2df003f6cebaa139aa227252a76195aef4f258913cc7da8ee795b2" from cache

/: (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
CYGWIN_NT-10.0-20348 3.6.4-1.x86_64
The OCaml toplevel, version 5.3.0
2.4.1
2025-08-04 11:06.26 ---> using "dd7e0e07f915e027f7b9342b70fd32de99a0178469368b94a3f981825bdf3d42" from cache

/: (run (cache (opam-archives (target "c:\\Users\\opam\\AppData\\local\\opam\\download-cache"))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 6129d098e5bdb58a83ef764c0af49db775a2987c || git fetch origin master) && git reset -q --hard 6129d098e5bdb58a83ef764c0af49db775a2987c && git log --no-decorate -n1 --oneline && opam update -u"))
From https://github.com/ocaml/opam-repository
 * branch                  master     -> FETCH_HEAD
   f1d3c77350..6129d098e5  master     -> origin/master
6129d098e5 Merge pull request #28274 from hhugo/release-js_of_ocaml-6.2.0

<><> Updating package repositories ><><><><><><><><><><><><><><><><><><><><><><>
[default] synchronised from file://C:/Users/opam/opam-repository
Already up-to-date.
Nothing to do.
# To update the current shell environment, run: eval $(opam env)
2025-08-04 11:06.26 ---> using "33c8a8591464c1381026712cf471fe002c624645247501eb71dfedb73ade8410" from cache

/: (copy (src picos_std.opam picos_mux.opam picos_meta.opam picos_lwt.opam picos_io_cohttp.opam picos_io.opam picos_aux.opam picos.opam) (dst /Users/opam/src/./))
2025-08-04 11:06.36 ---> saved as "9139450fa1a5187d1873afe3e67501e21267f12a91e0a952a00005f6f89ff197"

/: (run (network host) (shell "opam pin add -yn picos_std.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_mux.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_meta.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_lwt.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_io_cohttp.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_io.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos_aux.dev '/Users/opam/src/./' && \ \nopam pin add -yn picos.dev '/Users/opam/src/./'"))
[picos_std.dev] synchronised (file://C:/Users/opam/src/.)
picos_std is now pinned to file://C:/Users/opam/src/. (version dev)
[picos_mux.dev] synchronised (file://C:/Users/opam/src/.)
picos_mux is now pinned to file://C:/Users/opam/src/. (version dev)
[picos_meta.dev] synchronised (file://C:/Users/opam/src/.)
picos_meta is now pinned to file://C:/Users/opam/src/. (version dev)
[picos_lwt.dev] synchronised (file://C:/Users/opam/src/.)
picos_lwt is now pinned to file://C:/Users/opam/src/. (version dev)
[picos_io_cohttp.dev] synchronised (file://C:/Users/opam/src/.)
picos_io_cohttp is now pinned to file://C:/Users/opam/src/. (version dev)
[picos_io.dev] synchronised (file://C:/Users/opam/src/.)
picos_io is now pinned to file://C:/Users/opam/src/. (version dev)
[picos_aux.dev] synchronised (file://C:/Users/opam/src/.)
picos_aux is now pinned to file://C:/Users/opam/src/. (version dev)
[picos.dev] synchronised (file://C:/Users/opam/src/.)
picos is now pinned to file://C:/Users/opam/src/. (version dev)
2025-08-04 11:07.35 ---> saved as "14a3f5b1c6902f724bdce0a7d1bf8559469dc7a7a5b1c7d6a7616691ee75dd59"

/: (run (network host) (shell "echo '(lang dune 3.0)' > '/cygdrive/c/Users/opam/src/./dune-project'"))
2025-08-04 11:07.58 ---> saved as "3cab46205f72d323ad6837fed6a235e751732ea8b025ebaf3379097dc853e598"

/: (env DEPS "alcotest.1.9.0 angstrom.0.16.1 arch-x86_64.1 astring.0.8.5 backoff.0.1.1 base.v0.17.3 base-bigarray.base base-bytes.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base base64.3.5.1 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 cohttp.6.1.1 cohttp-lwt.6.1.1 conf-mingw-w64-gcc-x86_64.1 containers.3.16 cppo.1.8.0 csexp.1.5.2 domain-local-await.1.0.1 domain-name.0.4.1 domain_shims.0.1.0 dscheck.0.5.0 dune.3.19.1 dune-configurator.3.19.1 either.1.0.0 flexdll.0.44 fmt.0.11.0 gen.1.1 host-arch-x86_64.1 host-system-mingw.1 http.6.1.1 ipaddr.5.6.1 js_of_ocaml.6.2.0 js_of_ocaml-compiler.6.2.0 logs.0.9.0 lwt.5.9.1 macaddr.5.6.1 mdx.2.5.0 menhir.20240715 menhirCST.20240715 menhirLib.20240715 menhirSdk.20240715 mingw-w64-shims.0.2.0 mtime.2.1.0 multicore-bench.0.1.7 multicore-magic.2.3.1 multicore-magic-dscheck.2.3.1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-env-mingw64.1 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.1 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 ocplib-endian.1.2 oseq.0.5.1 ppx_derivers.1.2.1 ppx_sexp_conv.v0.17.1 ppxlib.0.36.0 ppxlib_jane.v0.17.4 psq.0.2.1 qcheck-core.0.26 qcheck-multicoretests-util.0.9 qcheck-stm.0.9 re.1.13.2 result.1.5 sedlex.3.6 seq.base sexplib0.v0.17.0 stdlib-shims.0.3.0 stringext.1.6.0 system-mingw.1 thread-local-storage.0.2 thread-table.1.0.0 topkg.1.1.0 tsort.2.2.0 uri.4.4.0 uri-sexp.4.4.0 uutf.1.0.4 yojson.3.0.0")

/: (env CI true)

/: (env OCAMLCI true)

/: (run (cache (opam-archives (target "c:\\Users\\opam\\AppData\\local\\opam\\download-cache"))) (network host) (shell "opam update --depexts && opam install --cli=2.4 --depext-only -y picos_std.dev picos_mux.dev picos_meta.dev picos_lwt.dev picos_io_cohttp.dev picos_io.dev picos_aux.dev picos.dev $DEPS"))

<><> Synchronising pinned packages ><><><><><><><><><><><><><><><><><><><><><><>
[picos.dev] synchronised (file://C:/Users/opam/src/.)
[picos_aux.dev] synchronised (file://C:/Users/opam/src/.)
[picos_io.dev] synchronised (file://C:/Users/opam/src/.)
[picos_io_cohttp.dev] synchronised (file://C:/Users/opam/src/.)
[picos_lwt.dev] synchronised (file://C:/Users/opam/src/.)
[picos_meta.dev] synchronised (file://C:/Users/opam/src/.)
[picos_mux.dev] synchronised (file://C:/Users/opam/src/.)
[picos_std.dev] synchronised (file://C:/Users/opam/src/.)
[NOTE] Package system-mingw is already installed (current version is 1).
[NOTE] Package ocaml-options-vanilla is already installed (current version is 1).
[NOTE] Package ocaml-env-mingw64 is already installed (current version is 1).
[NOTE] Package ocaml-config is already installed (current version is 3).
[NOTE] Package ocaml-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml is already installed (current version is 5.3.0).
[NOTE] Package mingw-w64-shims is already installed (current version is 0.2.0).
[NOTE] Package host-system-mingw is already installed (current version is 1).
[NOTE] Package host-arch-x86_64 is already installed (current version is 1).
[NOTE] Package flexdll is already installed (current version is 0.44).
[NOTE] Package conf-mingw-w64-gcc-x86_64 is already installed (current version is 1).
[NOTE] Package base-unix is already installed (current version is base).
[NOTE] Package base-threads is already installed (current version is base).
[NOTE] Package base-nnp is already installed (current version is base).
[NOTE] Package base-effects is already installed (current version is base).
[NOTE] Package base-domains is already installed (current version is base).
[NOTE] Package base-bigarray is already installed (current version is base).
[NOTE] Package arch-x86_64 is already installed (current version is 1).
2025-08-04 11:09.44 ---> saved as "51d5a10a8e6d9f471b5ec4165ad1f405edc4e543a985024ce51b15706e1b9acd"

/: (run (cache (opam-archives (target "c:\\Users\\opam\\AppData\\local\\opam\\download-cache"))) (network host) (shell "opam install $DEPS"))
[NOTE] Package system-mingw is already installed (current version is 1).
[NOTE] Package ocaml-options-vanilla is already installed (current version is 1).
[NOTE] Package ocaml-env-mingw64 is already installed (current version is 1).
[NOTE] Package ocaml-config is already installed (current version is 3).
[NOTE] Package ocaml-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml is already installed (current version is 5.3.0).
[NOTE] Package mingw-w64-shims is already installed (current version is 0.2.0).
[NOTE] Package host-system-mingw is already installed (current version is 1).
[NOTE] Package host-arch-x86_64 is already installed (current version is 1).
[NOTE] Package flexdll is already installed (current version is 0.44).
[NOTE] Package conf-mingw-w64-gcc-x86_64 is already installed (current version is 1).
[NOTE] Package base-unix is already installed (current version is base).
[NOTE] Package base-threads is already installed (current version is base).
[NOTE] Package base-nnp is already installed (current version is base).
[NOTE] Package base-effects is already installed (current version is base).
[NOTE] Package base-domains is already installed (current version is base).
[NOTE] Package base-bigarray is already installed (current version is base).
[NOTE] Package arch-x86_64 is already installed (current version is 1).
The following actions will be performed:
=== install 71 packages
  - install alcotest 1.9.0
  - install angstrom 0.16.1
  - install astring 0.8.5
  - install backoff 0.1.1
  - install base v0.17.3
  - install base-bytes base
  - install base64 3.5.1
  - install bigstringaf 0.10.0
  - install camlp-streams 5.0.1
  - install cmdliner 1.3.0
  - install cohttp 6.1.1
  - install cohttp-lwt 6.1.1
  - install containers 3.16
  - install cppo 1.8.0
  - install csexp 1.5.2
  - install domain-local-await 1.0.1
  - install domain-name 0.4.1
  - install domain_shims 0.1.0
  - install dscheck 0.5.0
  - install dune 3.19.1
  - install dune-configurator 3.19.1
  - install either 1.0.0
  - install fmt 0.11.0
  - install gen 1.1
  - install http 6.1.1
  - install ipaddr 5.6.1
  - install js_of_ocaml 6.2.0
  - install js_of_ocaml-compiler 6.2.0
  - install logs 0.9.0
  - install lwt 5.9.1
  - install macaddr 5.6.1
  - install mdx 2.5.0
  - install menhir 20240715
  - install menhirCST 20240715
  - install menhirLib 20240715
  - install menhirSdk 20240715
  - install mtime 2.1.0
  - install multicore-bench 0.1.7
  - install multicore-magic 2.3.1
  - install multicore-magic-dscheck 2.3.1
  - install ocaml-compiler-libs v0.17.0
  - install ocaml-syntax-shims 1.0.0
  - install ocaml-version 4.0.1
  - install ocaml_intrinsics_kernel v0.17.1
  - install ocamlbuild 0.16.1
  - install ocamlfind 1.9.8
  - install ocplib-endian 1.2
  - install oseq 0.5.1
  - install ppx_derivers 1.2.1
  - install ppx_sexp_conv v0.17.1
  - install ppxlib 0.36.0
  - install ppxlib_jane v0.17.4
  - install psq 0.2.1
  - install qcheck-core 0.26
  - install qcheck-multicoretests-util 0.9
  - install qcheck-stm 0.9
  - install re 1.13.2
  - install result 1.5
  - install sedlex 3.6
  - install seq base
  - install sexplib0 v0.17.0
  - install stdlib-shims 0.3.0
  - install stringext 1.6.0
  - install thread-local-storage 0.2
  - install thread-table 1.0.0
  - install topkg 1.1.0
  - install tsort 2.2.0
  - install uri 4.4.0
  - install uri-sexp 4.4.0
  - install uutf 1.0.4
  - install yojson 3.0.0

<><> Processing actions <><><><><><><><><><><><><><><><><><><><><><><><><><><><>
-> retrieved alcotest.1.9.0 (cached)
-> retrieved angstrom.0.16.1 (cached)
-> retrieved astring.0.8.5 (cached)
-> retrieved backoff.0.1.1 (cached)
-> retrieved base64.3.5.1 (cached)
-> retrieved bigstringaf.0.10.0 (cached)
-> retrieved camlp-streams.5.0.1 (cached)
-> retrieved cmdliner.1.3.0 (cached)
-> retrieved cohttp.6.1.1, cohttp-lwt.6.1.1, http.6.1.1 (https://github.com/mirage/ocaml-cohttp/releases/download/v6.1.1/cohttp-6.1.1.tbz)
-> retrieved base.v0.17.3 (https://github.com/janestreet/base/archive/refs/tags/v0.17.3.tar.gz)
-> retrieved containers.3.16 (https://github.com/c-cube/ocaml-containers/releases/download/v3.16/containers-3.16.tbz)
-> retrieved cppo.1.8.0 (cached)
-> retrieved csexp.1.5.2 (cached)
-> retrieved domain-local-await.1.0.1 (cached)
-> retrieved domain-name.0.4.1 (cached)
-> retrieved domain_shims.0.1.0 (cached)
-> retrieved dscheck.0.5.0 (cached)
-> retrieved either.1.0.0 (cached)
-> retrieved gen.1.1 (cached)
-> retrieved fmt.0.11.0 (https://erratique.ch/software/fmt/releases/fmt-0.11.0.tbz)
-> retrieved ipaddr.5.6.1, macaddr.5.6.1 (https://github.com/mirage/ocaml-ipaddr/releases/download/v5.6.1/ipaddr-5.6.1.tbz)
-> retrieved logs.0.9.0 (https://erratique.ch/software/logs/releases/logs-0.9.0.tbz)
-> retrieved lwt.5.9.1 (cached)
-> retrieved mdx.2.5.0 (cached)
-> retrieved menhir.20240715, menhirCST.20240715, menhirLib.20240715, menhirSdk.20240715 (cached)
-> retrieved dune.3.19.1, dune-configurator.3.19.1 (https://github.com/ocaml/dune/releases/download/3.19.1/dune-3.19.1.tbz)
-> retrieved js_of_ocaml.6.2.0, js_of_ocaml-compiler.6.2.0 (https://github.com/ocsigen/js_of_ocaml/releases/download/6.2.0/js_of_ocaml-6.2.0.tbz)
-> retrieved mtime.2.1.0 (cached)
-> retrieved multicore-bench.0.1.7 (cached)
-> retrieved multicore-magic.2.3.1, multicore-magic-dscheck.2.3.1 (cached)
-> retrieved ocaml-compiler-libs.v0.17.0 (cached)
-> retrieved ocaml-syntax-shims.1.0.0 (cached)
-> retrieved ocaml_intrinsics_kernel.v0.17.1 (cached)
-> retrieved ocaml-version.4.0.1 (https://github.com/ocurrent/ocaml-version/releases/download/v4.0.1/ocaml-version-4.0.1.tbz)
-> retrieved ocamlbuild.0.16.1 (cached)
-> retrieved ocamlfind.1.9.8 (cached)
-> retrieved ocplib-endian.1.2 (cached)
-> retrieved oseq.0.5.1 (cached)
-> retrieved ppx_derivers.1.2.1 (cached)
-> retrieved ppxlib.0.36.0 (cached)
-> retrieved ppx_sexp_conv.v0.17.1 (https://github.com/janestreet/ppx_sexp_conv/archive/refs/tags/v0.17.1.tar.gz)
-> retrieved ppxlib_jane.v0.17.4 (https://github.com/janestreet/ppxlib_jane/archive/refs/tags/v0.17.4.tar.gz)
-> retrieved psq.0.2.1 (cached)
-> retrieved re.1.13.2 (https://github.com/ocaml/ocaml-re/archive/refs/tags/1.13.2.tar.gz)
-> retrieved qcheck-core.0.26 (https://github.com/c-cube/qcheck/archive/v0.26.tar.gz)
-> retrieved qcheck-multicoretests-util.0.9, qcheck-stm.0.9 (https://github.com/ocaml-multicore/multicoretests/archive/refs/tags/0.9.tar.gz)
-> retrieved seq.base (cached)
-> retrieved result.1.5 (cached)
-> retrieved sexplib0.v0.17.0 (cached)
-> retrieved sedlex.3.6 (https://github.com/ocaml-community/sedlex/archive/refs/tags/v3.6.tar.gz)
-> retrieved stdlib-shims.0.3.0 (cached)
-> retrieved stringext.1.6.0 (cached)
-> retrieved thread-local-storage.0.2 (cached)
-> retrieved thread-table.1.0.0 (cached)
-> retrieved tsort.2.2.0 (cached)
-> retrieved topkg.1.1.0 (https://erratique.ch/software/topkg/releases/topkg-1.1.0.tbz)
-> retrieved uri.4.4.0, uri-sexp.4.4.0 (cached)
-> retrieved uutf.1.0.4 (cached)
-> installed cmdliner.1.3.0
-> installed seq.base
-> retrieved yojson.3.0.0 (https://github.com/ocaml-community/yojson/releases/download/3.0.0/yojson-3.0.0.tbz)
[WARNING] .install file is missing .exe extension for src/findlib/ocamlfind
[WARNING] .install file is missing .exe extension for src/findlib/ocamlfind_opt
[WARNING] Automatically adding .exe to C:\Users\opam\AppData\Local\opam\5.3.0\.opam-switch\build\ocamlfind.1.9.8\src\findlib\ocamlfind.exe
[WARNING] Automatically adding .exe to C:\Users\opam\AppData\Local\opam\5.3.0\.opam-switch\build\ocamlfind.1.9.8\src\findlib\ocamlfind_opt.exe
[WARNING] C:\Users\opam\AppData\Local\opam\5.3.0\bin\safe_camlp4 is a script; the command won't be available
-> installed ocamlfind.1.9.8
-> installed base-bytes.base
-> installed ocamlbuild.0.16.1
-> installed topkg.1.1.0
-> installed uutf.1.0.4
-> installed astring.0.8.5
-> installed dune.3.19.1
-> installed fmt.0.11.0
-> installed backoff.0.1.1
-> installed camlp-streams.5.0.1
-> installed csexp.1.5.2
-> installed domain-name.0.4.1
-> installed base64.3.5.1
-> installed cppo.1.8.0
-> installed domain_shims.0.1.0
-> installed either.1.0.0
-> installed gen.1.1
-> installed http.6.1.1
-> installed macaddr.5.6.1
-> installed menhirCST.20240715
-> installed mtime.2.1.0
-> installed menhirSdk.20240715
-> installed multicore-magic.2.3.1
-> installed ipaddr.5.6.1
-> installed menhirLib.20240715
-> installed ocaml-syntax-shims.1.0.0
-> installed ocaml-version.4.0.1
-> installed dune-configurator.3.19.1
-> installed ocaml-compiler-libs.v0.17.0
-> installed ocaml_intrinsics_kernel.v0.17.1
-> installed ocplib-endian.1.2
-> installed oseq.0.5.1
-> installed ppx_derivers.1.2.1
-> installed psq.0.2.1
-> installed bigstringaf.0.10.0
-> installed result.1.5
-> installed qcheck-core.0.26
-> installed stdlib-shims.0.3.0
-> installed angstrom.0.16.1
-> installed re.1.13.2
-> installed qcheck-multicoretests-util.0.9
-> installed sexplib0.v0.17.0
-> installed stringext.1.6.0
-> installed containers.3.16
-> installed qcheck-stm.0.9
-> installed thread-local-storage.0.2
-> installed alcotest.1.9.0
-> installed thread-table.1.0.0
-> installed tsort.2.2.0
-> installed uri.4.4.0
-> installed domain-local-await.1.0.1
-> installed lwt.5.9.1
-> installed dscheck.0.5.0
-> installed yojson.3.0.0
-> installed multicore-magic-dscheck.2.3.1
-> installed multicore-bench.0.1.7
-> installed base.v0.17.3
-> installed ppxlib.0.36.0
-> installed ppxlib_jane.v0.17.4
-> installed menhir.20240715
-> installed sedlex.3.6
-> installed ppx_sexp_conv.v0.17.1
-> installed uri-sexp.4.4.0
-> installed js_of_ocaml-compiler.6.2.0
-> installed logs.0.9.0
-> installed mdx.2.5.0
-> installed cohttp.6.1.1
-> installed js_of_ocaml.6.2.0
-> installed cohttp-lwt.6.1.1
Done.
# To update the current shell environment, run: eval $(opam env)
2025-08-04 11:22.49 ---> saved as "a876942814cca8f73a4bd3956104df33b87f81814309b434606ac1b475bc5630"

/: (copy (src .) (dst /Users/opam/src))
2025-08-04 11:23.37 ---> saved as "7e8a4d2194982b0317ac73266b4f8797817982143c60c4fd564a82338bb89582"

/: (run (shell "cd /cygdrive/c/Users/opam/src && opam exec -- dune build @install @check @runtest && rm -rf _build"))
(cd _build/default/test && .\test_picos_dscheck.exe)
Testing `Picos DSCheck'.
This run has ID `7R84EKQJ'.
[OK] Trigger 0 basic contract.
[OK] Computation 0 basic contract.
[OK] Computation 1 removes triggers.
Full test results in `~\src\_build\default\test\_build\_tests\7R84EKQJ'.
Test Successful in 0.985s. 3 tests run.
(cd _build/default/test && .\test_mpmcq_dscheck.exe)
Testing `Picos_mpmcq DSCheck'.
This run has ID `TN901SQO'.
[OK] Multiple pushes and pops 0
Full test results in `~\src\_build\default\test\_build\_tests\TN901SQO'.
Test Successful in 2.406s. 1 test run.
(cd _build/default/test && .\test_io_with_lwt.exe)
Testing `Picos_io_with_lwt'.
This run has ID `XKGHD39Q'.
Full test results in `~\src\_build\default\test\_build\_tests\XKGHD39Q'.
Test Successful in 0.000s. 0 test run.
(cd _build/default/test && .\test_finally.exe)
Testing `Picos_finally'.
This run has ID `AOZTVCKC'.
[OK] move 0 is lazy.
[OK] borrow 0 returns resource.
Full test results in `~\src\_build\default\test\_build\_tests\AOZTVCKC'.
Test Successful in 0.125s. 2 tests run.
(cd _build/default/example && .\guards.exe)
Testing with scheduler: multififos ~quota:44 ~n_domains:8
Ran guarded case statement examples.
(cd _build/default/test && .\test_io.exe)
Testing `Picos_io'.
This run has ID `H1DN6GG6'.
[OK] Unix 0 openfile and read.
[OK] Unix 1 sleepf.
[OK] Unix 2 select empty timeout.
[OK] Unix 3 select empty ∞.
[OK] Unix 4 select.
Full test results in `~\src\_build\default\test\_build\_tests\H1DN6GG6'.
Test Successful in 0.501s. 5 tests run.
(cd _build/default/test && .\test_lwt_unix.exe)
Testing `Picos_lwt'.
This run has ID `GTCR6IED'.
[OK] Basics 0
Full test results in `~\src\_build\default\test\_build\_tests\GTCR6IED'.
Test Successful in 0.063s. 1 test run.
(cd _build/default/test && .\test_select.exe)
Testing `Picos_select'.
This run has ID `MY1LBCFS'.
Full test results in `~\src\_build\default\test\_build\_tests\MY1LBCFS'.
Test Successful in 0.000s. 0 test run.
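
Note: the `Testing ...' / `[OK] ...' / `Full test results in ...' blocks in this log are Alcotest output. A minimal sketch of how such a suite is registered is shown below; the suite and case names are hypothetical, not the actual picos test code.

    (* Minimal Alcotest sketch (hypothetical names, not the picos sources).
       Alcotest.run prints the "Testing ..." banner, one [OK]/[FAIL] line per
       case, the "Full test results in ..." path and the final summary. *)
    let test_is_lazy () =
      let evaluated = ref false in
      let thunk = lazy (evaluated := true) in
      Alcotest.(check bool) "not evaluated yet" false !evaluated;
      Lazy.force thunk;
      Alcotest.(check bool) "evaluated after force" true !evaluated

    let () =
      Alcotest.run "Picos_finally (sketch)"
        [ ("move", [ Alcotest.test_case "is lazy" `Quick test_is_lazy ]) ]
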
(cd _build/default/test && .\test_server_and_client.exe)
Using non-blocking sockets and fibers on OCaml 5:
Looping server running
Server listening
Client B running
Server accepting
Client A running
Client B connected
Client A connected
Client B wrote 100
Client A wrote 100
Server accepted client
Server accepting
Server read 100
Server accepted client
Server wrote 50
Server read 100
Server accepting
Server wrote 50
Client A read 50
Client B read 50
Server and Client test: OK
(cd _build/default/test && .\test_picos.exe)
Testing `Picos'.
This run has ID `800A83WR'.
[OK] Trigger 0 basics.
[OK] Computation 0 basics.
[OK] Computation 1 tx.
[OK] Computation 2 signals in order.
[OK] Fiber.FLS 0 basics.
[OK] Cancel 0
[OK] Cancel after 0
Full test results in `~\src\_build\default\test\_build\_tests\800A83WR'.
Test Successful in 60.375s. 7 tests run.
(cd _build/default/test && .\test_lock.exe)
random seed: 2272695533225697826
generated error fail pass / total time test name
[✓] 32 0 0 32 / 32 0.0s Lock sequential
[✓] 32 0 0 32 / 32 18.3s Lock parallel
================================================================================
success (ran 2 tests)
random seed: 352168791250033339
generated error fail pass / total time test name
[✓] 64 0 0 64 / 64 0.0s Lock sequential
[✓] 64 0 0 64 / 64 43.9s Lock parallel
================================================================================
success (ran 2 tests)
(cd _build/default/test && .\test_structured.exe)
Testing `Picos_structured'.
This run has ID `95CRX2GP'.
[OK] Bundle 0 fork after terminate.
[OK] Bundle 1 fork after escape.
[OK] Bundle 2 exception in child terminates.
[OK] Bundle 3 cancelation awaits children.
[OK] Bundle 4 block raises when forbidden.
[OK] Bundle 5 block raises Sys_error when fiber finishes.
[OK] Bundle 6 termination nests.
[OK] Bundle 7 promise cancelation does not terminate.
[OK] Bundle 8 error in promise terminates.
[OK] Bundle 9 can wait promises.
[OK] Bundle 10 can select promises.
[OK] Run 0 any and all errors.
[OK] Run 1 any and all returns.
[OK] Run 2 race any.
Full test results in `~\src\_build\default\test\_build\_tests\95CRX2GP'.
Test Successful in 2.109s. 14 tests run.
(cd _build/default/test && .\test_sync.exe -- "^Mutex and Condition$" 0)
Testing `Picos_sync'.
This run has ID `AKBZLAFZ'.
[OK] Mutex and Condition 0 basics.
[SKIP] Mutex and Condition 1 errors.
[SKIP] Mutex and Condition 2 cancelation.
[SKIP] Lock and Lock.Condition 0 basics.
[SKIP] Lock and Lock.Condition 1 cancelation.
[SKIP] Lock and Lock.Condition 2 poisoning.
[SKIP] Lock and Lock.Condition 3 try_acquire.
[SKIP] Rwlock and Rwlock.Condition 0 basics.
[SKIP] Rwlock and Rwlock.Condition 1 cancelation.
[SKIP] Rwlock and Rwlock.Condition 2 poisoning.
[SKIP] Rwlock and Rwlock.Condition 3 freezing.
[SKIP] Rwlock and Rwlock.Condition 4 try_acquire.
[SKIP] Rwlock and Rwlock.Condition 5 try_acquire_shared.
[SKIP] Rwlock and Rwlock.Condition 6 sharing.
[SKIP] Semaphore 0 basics.
[SKIP] Semaphore 1 stress.
[SKIP] Sem 0 basics.
[SKIP] Sem 1 stress.
[SKIP] Sem 2 poisoning.
[SKIP] Sem 3 try_acquire.
[SKIP] Lazy 0 basics.
[SKIP] Lazy 1 cancelation.
[SKIP] Event 0 basics.
[SKIP] Barrier 0 basics.
[SKIP] Barrier 1 poisoning.
[SKIP] Non-cancelable ops 0 are not canceled.
Full test results in `~\src\_build\default\test\_build\_tests\AKBZLAFZ'.
Test Successful in 0.016s. 1 test run.
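
Note: the `Lock sequential' / `Lock parallel' tables above (and the Mutex, Htbl, Sem and Mpmcq tables that follow) are printed by qcheck-core's runner driving qcheck-stm model-based tests across domains. The sketch below is a much-simplified stand-in, not the project's actual STM specification: it only checks a counter invariant over Stdlib.Mutex, but running it produces the same style of "generated error fail pass / total" table. Assumed dune libraries: qcheck-core and qcheck-core.runner.

    (* Simplified stand-in for a parallel property test (not picos code). *)
    let lock_parallel_sketch =
      QCheck.Test.make ~count:32 ~name:"Lock parallel (sketch)" QCheck.small_nat
        (fun n ->
          let m = Mutex.create () in
          let counter = ref 0 in
          let work () =
            for _ = 1 to n do
              Mutex.lock m;
              incr counter;          (* critical section protected by m *)
              Mutex.unlock m
            done
          in
          (* Run the same workload from two domains in parallel. *)
          let d1 = Domain.spawn work and d2 = Domain.spawn work in
          Domain.join d1;
          Domain.join d2;
          !counter = 2 * n)

    (* The CLI runner prints the per-test table and the final summary. *)
    let _ = QCheck_base_runner.run_tests_main [ lock_parallel_sketch ]
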
(cd _build/default/test && .\test_mutex.exe)
random seed: 2070039740220683092
generated error fail pass / total time test name
[✓] 32 0 0 32 / 32 0.0s Mutex sequential
[✓] 32 0 0 32 / 32 17.8s Mutex parallel
================================================================================
success (ran 2 tests)
random seed: 1798940960418826576
generated error fail pass / total time test name
[✓] 64 0 0 64 / 64 0.0s Mutex sequential
[✓] 64 0 0 64 / 64 46.1s Mutex parallel
================================================================================
success (ran 2 tests)
(cd _build/default/test && .\test_htbl.exe)
random seed: 3149810349947428524
generated error fail pass / total time test name
[✓] 32 0 0 32 / 32 0.0s Htbl sequential
[✓] 32 0 0 32 / 32 4.6s Htbl parallel
================================================================================
success (ran 2 tests)
random seed: 4586859127345094930
generated error fail pass / total time test name
[✓] 64 0 0 64 / 64 0.0s Htbl sequential
[✓] 64 0 0 64 / 64 5.8s Htbl parallel
================================================================================
success (ran 2 tests)
random seed: 3127657213943184309
generated error fail pass / total time test name
[✓] 128 0 0 128 / 128 0.0s Htbl sequential
[✓] 128 0 0 128 / 128 7.8s Htbl parallel
================================================================================
success (ran 2 tests)
random seed: 2034419052730520434
generated error fail pass / total time test name
[✓] 256 0 0 256 / 256 0.0s Htbl sequential
[✓] 256 0 0 256 / 256 109.7s Htbl parallel
================================================================================
success (ran 2 tests)
(cd _build/default/test && .\test_sem.exe)
random seed: 2729649587872972205
generated error fail pass / total time test name
[✓] 32 0 0 32 / 32 0.0s Sem sequential
[✓] 32 0 0 32 / 32 15.7s Sem parallel
================================================================================
success (ran 2 tests)
random seed: 1710201572889975751
generated error fail pass / total time test name
[✓] 64 0 0 64 / 64 0.0s Sem sequential
[✓] 64 0 0 64 / 64 47.4s Sem parallel
================================================================================
success (ran 2 tests)
(cd _build/default/test && .\test_sync.exe -- "^Mutex and Condition$" 1)
Testing `Picos_sync'. This run has ID `FXA23WEG'.
[OK] Mutex and Condition 1 errors. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\FXA23WEG'. Test Successful in 0.109s. 1 test run.
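Each test_sync.exe invocation in this log selects a single Alcotest-style case by passing a name filter and a case index after --, which is why one case is reported [OK] while the rest of the suite shows up as [SKIP]. A minimal sketch of that layout, assuming plain Alcotest (the case bodies and the check are placeholders, not the picos tests):

  (* Hypothetical registration sketch. Invoking the resulting binary as
       test_sync.exe -- "^Mutex and Condition$" 1
     runs only case 1 of the matching group and reports the others as [SKIP]. *)
  let basics () = Alcotest.(check bool) "basics hold" true true
  let errors () = Alcotest.(check bool) "errors are raised" true true

  let () =
    Alcotest.run "Picos_sync"
      [
        ( "Mutex and Condition",
          [
            Alcotest.test_case "basics" `Quick basics;
            Alcotest.test_case "errors" `Quick errors;
          ] );
      ]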
(cd _build/default/bench && .\main.exe -brief "Picos Computation")
Picos Computation:
attach detach pairs over time/1 worker: 1.25 M/s
attach detach pairs over time/2 workers: 1.71 M/s
attach detach pairs over time/4 workers: 1.73 M/s
attach detach pairs over time/trivial: 4.78 M/s
time per attach detach pair/1 worker: 801.57 ns
time per attach detach pair/2 workers: 1166.95 ns
time per attach detach pair/4 workers: 2315.57 ns
time per attach detach pair/trivial: 209.35 ns
(cd _build/default/test && .\test_mpmcq.exe)
random seed: 1330188548922606205
generated error fail pass / total time test name
[✓] 32 0 0 32 / 32 0.0s Mpmcq sequential
[✓] 32 0 0 32 / 32 4.5s Mpmcq parallel
================================================================================
success (ran 2 tests)
random seed: 2682757953233201891
generated error fail pass / total time test name
[✓] 64 0 0 64 / 64 0.0s Mpmcq sequential
[✓] 64 0 0 64 / 64 6.2s Mpmcq parallel
================================================================================
success (ran 2 tests)
random seed: 661262644165683014
generated error fail pass / total time test name
[✓] 128 0 0 128 / 128 0.0s Mpmcq sequential
[✓] 128 0 0 128 / 128 7.5s Mpmcq parallel
================================================================================
success (ran 2 tests)
random seed: 94331686940915149
generated error fail pass / total time test name
[✓] 256 0 0 256 / 256 0.0s Mpmcq sequential
[✓] 256 0 0 256 / 256 114.0s Mpmcq parallel
================================================================================
success (ran 2 tests)
(cd _build/default/bench && .\main.exe -brief "Picos Current")
Picos Current:
ops over time/1 worker: 3.91 M/s
ops over time/2 workers: 7.46 M/s
ops over time/4 workers: 10.80 M/s
time per op/1 worker: 255.75 ns
time per op/2 workers: 268.12 ns
time per op/4 workers: 370.41 ns
(cd _build/default/bench && .\main.exe -brief "Picos FLS (excluding Current)")
Picos FLS (excluding Current):
gets over time/1 worker: 28.72 M/s
gets over time/2 workers: 85.21 M/s
gets over time/4 workers: 236.58 M/s
sets over time/1 worker: 38.00 M/s
sets over time/2 workers: 48.42 M/s
sets over time/4 workers: 190.84 M/s
time per get/1 worker: 34.82 ns
time per get/2 workers: 23.47 ns
time per get/4 workers: 16.91 ns
time per set/1 worker: 26.31 ns
time per set/2 workers: 41.31 ns
time per set/4 workers: 20.96 ns
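In these -brief benchmark reports the two groups of figures are the same measurement in two forms: the "time per ..." rows are, to rounding, the worker count divided by the corresponding combined throughput (for Picos Current with 2 workers, 2 / 7.46 M/s is about 268 ns, matching the line above). That relationship is an observation about the numbers printed here, not a statement about the benchmark's source; a tiny helper that performs the conversion:

  (* Converts a combined throughput in M ops/s into per-operation latency in
     nanoseconds, assuming the work is spread evenly over [workers] workers. *)
  let ns_per_op ~workers ~mops_per_sec =
    float_of_int workers /. (mops_per_sec *. 1e6) *. 1e9

  let () =
    (* "ops over time/2 workers: 7.46 M/s" -> roughly 268 ns per op *)
    Printf.printf "%.2f ns\n" (ns_per_op ~workers:2 ~mops_per_sec:7.46)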
(cd _build/default/test && .\test_sync.exe -- "^Mutex and Condition$" 2)
Testing `Picos_sync'. This run has ID `Y3T7UXJT'.
[OK] Mutex and Condition 2 cancelation. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\Y3T7UXJT'. Test Successful in 9.912s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Lock and Lock.Condition$" 0)
Testing `Picos_sync'. This run has ID `IES29T1K'.
[OK] Lock and Lock.Condition 0 basics. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\IES29T1K'. Test Successful in 0.159s. 1 test run.
(cd _build/default/bench && .\main.exe -brief "Picos TLS")
Picos TLS:
gets over time/1 worker: 9.70 M/s
gets over time/2 workers: 12.95 M/s
gets over time/4 workers: 31.57 M/s
sets over time/1 worker: 6.53 M/s
sets over time/2 workers: 22.05 M/s
sets over time/4 workers: 27.60 M/s
time per get/1 worker: 103.05 ns
time per get/2 workers: 154.47 ns
time per get/4 workers: 126.69 ns
time per set/1 worker: 153.21 ns
time per set/2 workers: 90.70 ns
time per set/4 workers: 144.92 ns
(cd _build/default/test && .\test_sync.exe -- "^Lock and Lock.Condition$" 1)
Testing `Picos_sync'. This run has ID `4PNLLCD7'.
[OK] Lock and Lock.Condition 1 cancelation. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\4PNLLCD7'. Test Successful in 6.236s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Lock and Lock.Condition$" 2)
Testing `Picos_sync'. This run has ID `CTWILWPE'.
[OK] Lock and Lock.Condition 2 poisoning. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\CTWILWPE'. Test Successful in 0.035s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Lock and Lock.Condition$" 3)
Testing `Picos_sync'. This run has ID `0ELP7FI7'.
[OK] Lock and Lock.Condition 3 try_acquire. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\0ELP7FI7'. Test Successful in 0.036s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Rwlock and Rwlock.Condition$" 0)
Testing `Picos_sync'. This run has ID `VX5G6X25'.
[OK] Rwlock and Rwlock.Condition 0 basics. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\VX5G6X25'. Test Successful in 0.015s. 1 test run.
(cd _build/default/test && .\test_rwlock.exe)
random seed: 2883726959542640082
generated error fail pass / total time test name
[✓] 32 0 0 32 / 32 0.0s Rwlock sequential
[✓] 32 0 0 32 / 32 23.0s Rwlock parallel
================================================================================
success (ran 2 tests)
random seed: 1326712006108271528
generated error fail pass / total time test name
[✓] 51 0 0 51 / 51 0.0s Rwlock sequential
[✓] 51 0 0 51 / 51 41.9s Rwlock parallel
================================================================================
success (ran 2 tests)
(cd _build/default/bench && .\main.exe -brief "Picos DLS")
Picos DLS:
gets over time/1 worker: 34.32 M/s
gets over time/2 workers: 63.26 M/s
gets over time/4 workers: 142.74 M/s
sets over time/1 worker: 16.70 M/s
sets over time/2 workers: 34.27 M/s
sets over time/4 workers: 97.23 M/s
time per get/1 worker: 29.13 ns
time per get/2 workers: 31.62 ns
time per get/4 workers: 28.02 ns
time per set/1 worker: 59.87 ns
time per set/2 workers: 58.37 ns
time per set/4 workers: 41.14 ns
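The "Picos DLS" figures presumably measure domain-local storage, i.e. one slot per domain rather than per system thread (TLS) or per fiber (FLS); that reading is an inference from the benchmark names, not from its source. With the OCaml 5 stdlib the primitive being exercised would look like this:

  (* Sketch of domain-local storage with the OCaml 5 stdlib; a get/set
     benchmark would hammer a key like this one from several domains. *)
  let counter_key : int ref Domain.DLS.key =
    Domain.DLS.new_key (fun () -> ref 0)

  let bump () =
    (* Every domain sees its own [int ref], so no locking is required. *)
    let r = Domain.DLS.get counter_key in
    incr r;
    !r

  let () =
    let d = Domain.spawn (fun () -> bump ()) in
    let here = bump () in
    Printf.printf "this domain: %d, other domain: %d\n" here (Domain.join d)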
(cd _build/default/test && .\test_sync.exe -- "^Rwlock and Rwlock.Condition$" 1)
Testing `Picos_sync'. This run has ID `KSESNRTA'.
[OK] Rwlock and Rwlock.Condition 1 cancelation. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\KSESNRTA'. Test Successful in 2.221s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Rwlock and Rwlock.Condition$" 2)
Testing `Picos_sync'. This run has ID `C4SY7XO3'.
[OK] Rwlock and Rwlock.Condition 2 poisoning. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\C4SY7XO3'. Test Successful in 0.018s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Rwlock and Rwlock.Condition$" 3)
Testing `Picos_sync'. This run has ID `4WNW95YU'.
[OK] Rwlock and Rwlock.Condition 3 freezing. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\4WNW95YU'. Test Successful in 0.000s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Rwlock and Rwlock.Condition$" 4)
Testing `Picos_sync'. This run has ID `1SPZQDFQ'.
[OK] Rwlock and Rwlock.Condition 4 try_acquire. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\1SPZQDFQ'. Test Successful in 0.013s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Rwlock and Rwlock.Condition$" 5)
Testing `Picos_sync'. This run has ID `J7F7ZTW6'.
[OK] Rwlock and Rwlock.Condition 5 try_acquire_shared. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\J7F7ZTW6'. Test Successful in 0.073s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Rwlock and Rwlock.Condition$" 6)
Testing `Picos_sync'. This run has ID `HM4D08JC'.
[OK] Rwlock and Rwlock.Condition 6 sharing. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\HM4D08JC'. Test Successful in 0.000s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Semaphore$ 0)
Testing `Picos_sync'. This run has ID `CFK21F9A'.
[OK] Semaphore 0 basics. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\CFK21F9A'. Test Successful in 0.016s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Semaphore$ 1)
Testing `Picos_sync'. This run has ID `OEOK7SZN'.
[OK] Semaphore 1 stress. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\OEOK7SZN'. Test Successful in 0.359s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Sem$ 0)
Testing `Picos_sync'. This run has ID `CNA8QBZB'.
[OK] Sem 0 basics. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\CNA8QBZB'. Test Successful in 0.000s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Sem$ 1)
Testing `Picos_sync'. This run has ID `JANJVUKN'.
[OK] Sem 1 stress. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\JANJVUKN'. Test Successful in 0.174s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Sem$ 2)
Testing `Picos_sync'. This run has ID `NEOK14DH'.
[OK] Sem 2 poisoning. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\NEOK14DH'. Test Successful in 0.014s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Sem$ 3)
Testing `Picos_sync'. This run has ID `ET0RYLEB'.
[OK] Sem 3 try_acquire. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\ET0RYLEB'. Test Successful in 0.000s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Lazy$ 0)
Testing `Picos_sync'. This run has ID `Z7JENSIJ'.
[OK] Lazy 0 basics. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\Z7JENSIJ'. Test Successful in 0.007s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Lazy$ 1)
Testing `Picos_sync'. This run has ID `VOFDR6H0'.
[OK] Lazy 1 cancelation. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\VOFDR6H0'. Test Successful in 0.046s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Event$ 0)
Testing `Picos_sync'. This run has ID `2VG7E7T0'.
[OK] Event 0 basics. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\2VG7E7T0'. Test Successful in 0.137s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Barrier$ 0)
Testing `Picos_sync'. This run has ID `J8B4RNA4'.
[OK] Barrier 0 basics. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\J8B4RNA4'. Test Successful in 0.167s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- ^Barrier$ 1)
Testing `Picos_sync'. This run has ID `BIBQGAYX'.
[OK] Barrier 1 poisoning. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\BIBQGAYX'. Test Successful in 0.188s. 1 test run.
(cd _build/default/test && .\test_sync.exe -- "^Non-cancelable ops$" 0)
Testing `Picos_sync'. This run has ID `EA1Q2J91'.
[OK] Non-cancelable ops 0 are not canceled. (remaining 25 cases [SKIP])
Full test results in `~\src\_build\default\test\_build\_tests\EA1Q2J91'. Test Successful in 0.020s. 1 test run.
(cd _build/default/test && .\test_io_cohttp.exe)
Uri: //127.0.0.1:49700/hello-io-cohttp
Method: POST
host: 127.0.0.1:49700
user-agent: ocaml-cohttp/v6.1.1
content-length: 17
Body: It's-a-Me, Picos!
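The test_io_cohttp output above is the request as seen by the test's HTTP server: a POST to /hello-io-cohttp whose content-length of 17 matches the 17-byte body. The test drives the picos_io_cohttp stack itself; purely for orientation, a similar request issued with the stock cohttp-lwt-unix client would look roughly as follows (the client module choice and the hard-coded port are assumptions, not taken from the test):

  (* Rough client-side sketch of the request shown above, using the ordinary
     cohttp-lwt-unix client instead of the picos-based one. *)
  open Lwt.Syntax

  let post_hello () =
    let uri = Uri.of_string "http://127.0.0.1:49700/hello-io-cohttp" in
    let body = Cohttp_lwt.Body.of_string "It's-a-Me, Picos!" in
    let* _resp, resp_body = Cohttp_lwt_unix.Client.post ~body uri in
    Cohttp_lwt.Body.to_string resp_body

  let () = print_endline (Lwt_main.run (post_hello ()))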
(cd _build/default/test && .\test_mpscq.exe)
random seed: 3681776551681044106
generated error fail pass / total time test name
[✓] 32 0 0 32 / 32 0.0s Mpscq sequential
[✓] 32 0 0 32 / 32 5.0s Mpscq parallel
================================================================================
success (ran 2 tests)
random seed: 699992769357180333
generated error fail pass / total time test name
[✓] 64 0 0 64 / 64 0.0s Mpscq sequential
[✓] 64 0 0 64 / 64 6.5s Mpscq parallel
================================================================================
success (ran 2 tests)
random seed: 3852510321555618218
generated error fail pass / total time test name
[✓] 128 0 0 128 / 128 0.0s Mpscq sequential
[✓] 128 0 0 128 / 128 15.5s Mpscq parallel
================================================================================
success (ran 2 tests)
random seed: 2941282081452148869
generated error fail pass / total time test name
[✓] 256 0 0 256 / 256 0.0s Mpscq sequential
[ ] 169 0 0 169 / 256 97.4s Mpscq parallel
[ ] 170 0 0 170 /
256 98.6s Mpscq parallel [ ] 171 0 0 171 / 256 99.6s Mpscq parallel [ ] 172 0 0 172 / 256 100.3s Mpscq parallel [ ] 173 0 0 173 / 256 101.5s Mpscq parallel [ ] 174 0 0 174 / 256 102.0s Mpscq parallel [ ] 175 0 0 175 / 256 102.8s Mpscq parallel [ ] 176 0 0 176 / 256 103.7s Mpscq parallel [ ] 177 0 0 177 / 256 104.5s Mpscq parallel [ ] 178 0 0 178 / 256 105.0s Mpscq parallel [ ] 179 0 0 179 / 256 105.6s Mpscq parallel [ ] 180 0 0 180 / 256 107.5s Mpscq parallel [ ] 181 0 0 181 / 256 108.9s Mpscq parallel [ ] 182 0 0 182 / 256 110.3s Mpscq parallel [ ] 183 0 0 183 / 256 110.8s Mpscq parallel [ ] 184 0 0 184 / 256 111.1s Mpscq parallel [ ] 185 0 0 185 / 256 111.9s Mpscq parallel [ ] 186 0 0 186 / 256 112.8s Mpscq parallel [ ] 187 0 0 187 / 256 114.4s Mpscq parallel [ ] 188 0 0 188 / 256 115.3s Mpscq parallel [ ] 189 0 0 189 / 256 117.0s Mpscq parallel [ ] 190 0 0 190 / 256 118.3s Mpscq parallel [ ] 191 0 0 191 / 256 119.0s Mpscq parallel [ ] 192 0 0 192 / 256 119.9s Mpscq parallel [ ] 192 0 0 192 / 256 120.0s Mpscq parallel (generating) [ ] 193 0 0 193 / 256 120.4s Mpscq parallel [ ] 194 0 0 194 / 256 121.0s Mpscq parallel [ ] 195 0 0 195 / 256 121.6s Mpscq parallel [ ] 196 0 0 196 / 256 122.5s Mpscq parallel [ ] 197 0 0 197 / 256 123.0s Mpscq parallel [ ] 198 0 0 198 / 256 123.3s Mpscq parallel [ ] 199 0 0 199 / 256 123.7s Mpscq parallel [ ] 200 0 0 200 / 256 124.2s Mpscq parallel [ ] 201 0 0 201 / 256 124.5s Mpscq parallel [ ] 202 0 0 202 / 256 124.9s Mpscq parallel [ ] 203 0 0 203 / 256 125.4s Mpscq parallel [ ] 204 0 0 204 / 256 125.8s Mpscq parallel [ ] 205 0 0 205 / 256 126.3s Mpscq parallel [ ] 206 0 0 206 / 256 126.9s Mpscq parallel [ ] 207 0 0 207 / 256 128.0s Mpscq parallel [ ] 208 0 0 208 / 256 128.6s Mpscq parallel [ ] 209 0 0 209 / 256 129.6s Mpscq parallel [ ] 210 0 0 210 / 256 130.5s Mpscq parallel [ ] 211 0 0 211 / 256 131.8s Mpscq parallel [ ] 212 0 0 212 / 256 132.1s Mpscq parallel [ ] 213 0 0 213 / 256 132.4s Mpscq parallel [ ] 214 0 0 214 / 256 132.6s Mpscq parallel [ ] 215 0 0 215 / 256 133.2s Mpscq parallel [ ] 216 0 0 216 / 256 133.5s Mpscq parallel [ ] 217 0 0 217 / 256 133.8s Mpscq parallel [ ] 218 0 0 218 / 256 135.7s Mpscq parallel [ ] 219 0 0 219 / 256 136.1s Mpscq parallel [ ] 220 0 0 220 / 256 136.5s Mpscq parallel [ ] 221 0 0 221 / 256 137.0s Mpscq parallel [ ] 222 0 0 222 / 256 137.8s Mpscq parallel [ ] 223 0 0 223 / 256 138.1s Mpscq parallel [ ] 224 0 0 224 / 256 138.5s Mpscq parallel [ ] 225 0 0 225 / 256 138.9s Mpscq parallel [ ] 226 0 0 226 / 256 139.3s Mpscq parallel [ ] 227 0 0 227 / 256 139.5s Mpscq parallel [ ] 228 0 0 228 / 256 139.8s Mpscq parallel [ ] 229 0 0 229 / 256 140.1s Mpscq parallel [ ] 230 0 0 230 / 256 140.3s Mpscq parallel [ ] 231 0 0 231 / 256 140.6s Mpscq parallel [ ] 232 0 0 232 / 256 140.9s Mpscq parallel [ ] 233 0 0 233 / 256 141.1s Mpscq parallel [ ] 234 0 0 234 / 256 141.4s Mpscq parallel [ ] 235 0 0 235 / 256 142.3s Mpscq parallel [ ] 236 0 0 236 / 256 142.5s Mpscq parallel [ ] 237 0 0 237 / 256 142.8s Mpscq parallel [ ] 238 0 0 238 / 256 143.0s Mpscq parallel [ ] 239 0 0 239 / 256 143.3s Mpscq parallel [ ] 240 0 0 240 / 256 143.7s Mpscq parallel [ ] 241 0 0 241 / 256 144.0s Mpscq parallel [ ] 242 0 0 242 / 256 144.2s Mpscq parallel [ ] 243 0 0 243 / 256 144.5s Mpscq parallel [ ] 244 0 0 244 / 256 144.8s Mpscq parallel [ ] 245 0 0 245 / 256 145.1s Mpscq parallel [ ] 246 0 0 246 / 256 145.6s Mpscq parallel [ ] 247 0 0 247 / 256 146.6s Mpscq parallel [ ] 248 0 0 248 / 256 146.9s Mpscq parallel [ ] 249 0 0 249 / 256 147.1s Mpscq 
parallel [ ] 250 0 0 250 / 256 147.4s Mpscq parallel [ ] 251 0 0 251 / 256 147.8s Mpscq parallel [ ] 252 0 0 252 / 256 148.2s Mpscq parallel [ ] 253 0 0 253 / 256 148.5s Mpscq parallel [ ] 254 0 0 254 / 256 148.9s Mpscq parallel [ ] 255 0 0 255 / 256 149.3s Mpscq parallel [ ] 256 0 0 256 / 256 151.0s Mpscq parallel [✓] 256 0 0 256 / 256 151.0s Mpscq parallel ================================================================================ success (ran 2 tests) (cd _build/default/bench && .\main.exe -brief "Yield with Picos_std_sync") Yield with Picos_std_sync: locked yields over time/1024 fibers with Lock: 0.12 M/s locked yields over time/1024 fibers with Rwlock: 0.19 M/s locked yields over time/1024 fibers with Sem: 0.28 M/s locked yields over time/1024 fibers with Sem 2: 0.43 M/s locked yields over time/1024 fibers with Sem 3: 0.33 M/s locked yields over time/1024 fibers with Sem 4: 0.35 M/s locked yields over time/1 fiber with Lock: 1.27 M/s locked yields over time/1 fiber with Rwlock: 0.67 M/s locked yields over time/1 fiber with Sem: 1.03 M/s locked yields over time/1 fiber with Sem 2: 1.14 M/s locked yields over time/1 fiber with Sem 3: 1.33 M/s locked yields over time/1 fiber with Sem 4: 1.27 M/s locked yields over time/256 fibers with Lock: 0.17 M/s locked yields over time/256 fibers with Rwlock: 0.26 M/s locked yields over time/256 fibers with Sem: 0.25 M/s locked yields over time/256 fibers with Sem 2: 0.38 M/s locked yields over time/256 fibers with Sem 3: 0.47 M/s locked yields over time/256 fibers with Sem 4: 0.62 M/s locked yields over time/2 domains with Lock: 0.44 M/s locked yields over time/2 domains with Rwlock: 0.89 M/s locked yields over time/2 domains with Sem: 0.45 M/s locked yields over time/2 domains with Sem 2: 3.11 M/s locked yields over time/2 domains with Sem 3: 1.76 M/s locked yields over time/2 domains with Sem 4: 1.19 M/s locked yields over time/2 fibers with Lock: 0.32 M/s locked yields over time/2 fibers with Rwlock: 0.11 M/s locked yields over time/2 fibers with Sem: 0.34 M/s locked yields over time/2 fibers with Sem 2: 1.10 M/s locked yields over time/2 fibers with Sem 3: 1.15 M/s locked yields over time/2 fibers with Sem 4: 1.57 M/s locked yields over time/3 domains with Lock: 0.58 M/s locked yields over time/3 domains with Rwlock: 0.43 M/s locked yields over time/3 domains with Sem: 0.55 M/s locked yields over time/3 domains with Sem 2: 0.67 M/s locked yields over time/3 domains with Sem 3: 2.06 M/s locked yields over time/3 domains with Sem 4: 1.19 M/s locked yields over time/3 fibers with Lock: 0.22 M/s locked yields over time/3 fibers with Rwlock: 0.16 M/s locked yields over time/3 fibers with Sem: 0.23 M/s locked yields over time/3 fibers with Sem 2: 0.53 M/s locked yields over time/3 fibers with Sem 3: 1.42 M/s locked yields over time/3 fibers with Sem 4: 1.67 M/s locked yields over time/4 domains with Lock: 0.54 M/s locked yields over time/4 domains with Rwlock: 0.54 M/s locked yields over time/4 domains with Sem: 1.10 M/s locked yields over time/4 domains with Sem 2: 1.57 M/s locked yields over time/4 domains with Sem 3: 3.02 M/s locked yields over time/4 domains with Sem 4: 1.07 M/s locked yields over time/4 fibers with Lock: 0.30 M/s locked yields over time/4 fibers with Rwlock: 0.23 M/s locked yields over time/4 fibers with Sem: 0.24 M/s locked yields over time/4 fibers with Sem 2: 0.59 M/s locked yields over time/4 fibers with Sem 3: 0.67 M/s locked yields over time/4 fibers with Sem 4: 1.19 M/s locked yields over time/512 fibers with Lock: 
0.20 M/s locked yields over time/512 fibers with Rwlock: 0.25 M/s locked yields over time/512 fibers with Sem: 0.29 M/s locked yields over time/512 fibers with Sem 2: 0.45 M/s locked yields over time/512 fibers with Sem 3: 0.45 M/s locked yields over time/512 fibers with Sem 4: 0.45 M/s locked yields over time/8 domains with Lock: 0.08 M/s locked yields over time/8 domains with Rwlock: 0.62 M/s locked yields over time/8 domains with Sem: 0.18 M/s locked yields over time/8 domains with Sem 2: 0.96 M/s locked yields over time/8 domains with Sem 3: 0.97 M/s locked yields over time/8 domains with Sem 4: 1.91 M/s locked yields over time/8 fibers with Lock: 0.21 M/s locked yields over time/8 fibers with Rwlock: 0.21 M/s locked yields over time/8 fibers with Sem: 0.27 M/s locked yields over time/8 fibers with Sem 2: 0.56 M/s locked yields over time/8 fibers with Sem 3: 0.62 M/s locked yields over time/8 fibers with Sem 4: 0.80 M/s time per locked yield/1024 fibers with Lock: 8523.49 ns time per locked yield/1024 fibers with Rwlock: 5154.95 ns time per locked yield/1024 fibers with Sem: 3601.22 ns time per locked yield/1024 fibers with Sem 2: 2337.31 ns time per locked yield/1024 fibers with Sem 3: 3058.41 ns time per locked yield/1024 fibers with Sem 4: 2847.42 ns time per locked yield/1 fiber with Lock: 789.82 ns time per locked yield/1 fiber with Rwlock: 1493.79 ns time per locked yield/1 fiber with Sem: 969.72 ns time per locked yield/1 fiber with Sem 2: 878.46 ns time per locked yield/1 fiber with Sem 3: 751.95 ns time per locked yield/1 fiber with Sem 4: 786.09 ns time per locked yield/256 fibers with Lock: 5736.11 ns time per locked yield/256 fibers with Rwlock: 3806.57 ns time per locked yield/256 fibers with Sem: 4038.43 ns time per locked yield/256 fibers with Sem 2: 2646.29 ns time per locked yield/256 fibers with Sem 3: 2142.09 ns time per locked yield/256 fibers with Sem 4: 1616.60 ns time per locked yield/2 domains with Lock: 4593.55 ns time per locked yield/2 domains with Rwlock: 2236.51 ns time per locked yield/2 domains with Sem: 4420.41 ns time per locked yield/2 domains with Sem 2: 643.20 ns time per locked yield/2 domains with Sem 3: 1134.92 ns time per locked yield/2 domains with Sem 4: 1686.70 ns time per locked yield/2 fibers with Lock: 3114.16 ns time per locked yield/2 fibers with Rwlock: 9409.16 ns time per locked yield/2 fibers with Sem: 2920.99 ns time per locked yield/2 fibers with Sem 2: 911.35 ns time per locked yield/2 fibers with Sem 3: 868.39 ns time per locked yield/2 fibers with Sem 4: 638.81 ns time per locked yield/3 domains with Lock: 5179.03 ns time per locked yield/3 domains with Rwlock: 7052.57 ns time per locked yield/3 domains with Sem: 5487.49 ns time per locked yield/3 domains with Sem 2: 4502.58 ns time per locked yield/3 domains with Sem 3: 1454.90 ns time per locked yield/3 domains with Sem 4: 2526.85 ns time per locked yield/3 fibers with Lock: 4515.60 ns time per locked yield/3 fibers with Rwlock: 6235.98 ns time per locked yield/3 fibers with Sem: 4279.45 ns time per locked yield/3 fibers with Sem 2: 1897.01 ns time per locked yield/3 fibers with Sem 3: 702.95 ns time per locked yield/3 fibers with Sem 4: 600.16 ns time per locked yield/4 domains with Lock: 7466.12 ns time per locked yield/4 domains with Rwlock: 7404.46 ns time per locked yield/4 domains with Sem: 3626.59 ns time per locked yield/4 domains with Sem 2: 2539.77 ns time per locked yield/4 domains with Sem 3: 1322.82 ns time per locked yield/4 domains with Sem 4: 3725.32 ns time per 
locked yield/4 fibers with Lock: 3380.68 ns time per locked yield/4 fibers with Rwlock: 4379.25 ns time per locked yield/4 fibers with Sem: 4167.27 ns time per locked yield/4 fibers with Sem 2: 1690.91 ns time per locked yield/4 fibers with Sem 3: 1482.87 ns time per locked yield/4 fibers with Sem 4: 838.06 ns time per locked yield/512 fibers with Lock: 4997.41 ns time per locked yield/512 fibers with Rwlock: 4004.76 ns time per locked yield/512 fibers with Sem: 3392.11 ns time per locked yield/512 fibers with Sem 2: 2233.00 ns time per locked yield/512 fibers with Sem 3: 2211.20 ns time per locked yield/512 fibers with Sem 4: 2226.38 ns time per locked yield/8 domains with Lock: 101934.13 ns time per locked yield/8 domains with Rwlock: 12993.38 ns time per locked yield/8 domains with Sem: 43607.70 ns time per locked yield/8 domains with Sem 2: 8340.76 ns time per locked yield/8 domains with Sem 3: 8206.24 ns time per locked yield/8 domains with Sem 4: 4192.65 ns time per locked yield/8 fibers with Lock: 4726.74 ns time per locked yield/8 fibers with Rwlock: 4713.23 ns time per locked yield/8 fibers with Sem: 3665.10 ns time per locked yield/8 fibers with Sem 2: 1771.41 ns time per locked yield/8 fibers with Sem 3: 1608.80 ns time per locked yield/8 fibers with Sem 4: 1257.04 ns (cd _build/default/bench && .\main.exe -brief "Picos Spawn") Picos Spawn: spawns over time/with packed computation: 1.02 M/s time per spawn/with packed computation: 978.31 ns (cd _build/default/test && .\test_schedulers.exe) Testing with scheduler: fifos ~quota:93 Fairness of 100 fibers performing at least 10000 yields: sd: 0.000026 -- ideally 0 mean: 1.000093 -- ideally 1 median: 1.000100 -- ideally 1 Testing `Picos schedulers'. This run has ID `C2PVPJVQ'. [OK] Trivial main returns 0 [OK] Scheduler completes main computation 0 [OK] Current 0 [OK] Cancel_after 0 basic. [OK] Cancel_after 1 long timeout. [OK] Operation on canceled fiber raises 0 [OK] Cross scheduler wakeup 0 [OK] Fatal exception terminates scheduler 0 Full test results in `~\src\_build\default\test\_build\_tests\C2PVPJVQ'. Test Successful in 67.952s. 8 tests run. 
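(Note: as context for the scheduler test output above, the following is a minimal OCaml sketch of what that test exercises: a batch of fibers repeatedly yielding on the picos_mux.fifo scheduler. The entry points used here, Picos_mux_fifo.run with its ~quota argument, Flock.join_after, Flock.fork and Control.yield from picos_std.structured, are assumed from the log output and package names rather than checked against the documented signatures, so treat this as an illustrative sketch, not the test code.)

(* Sketch only: fibers yielding on the FIFO scheduler, in the spirit of the
   fairness measurement above. Module and function names are assumptions
   taken from the log; exact signatures are not verified here. *)
open Picos_std_structured

let () =
  (* ~quota:93 mirrors the "fifos ~quota:93" configuration reported above;
     the quota bounds how long a fiber may run before it is descheduled. *)
  Picos_mux_fifo.run ~quota:93 (fun () ->
      Flock.join_after (fun () ->
          for _fiber = 1 to 100 do
            Flock.fork (fun () ->
                for _ = 1 to 10_000 do
                  Control.yield ()
                done)
          done))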
(cd _build/default/bench && .\main.exe -brief "Picos Yield") Picos Yield: time per yield/10000 fibers: 1424.61 ns time per yield/1000 fibers: 730.19 ns time per yield/100 fibers: 666.33 ns time per yield/10 fibers: 453.22 ns time per yield/1 fiber: 588.97 ns yields over time/10000 fibers: 0.70 M/s yields over time/1000 fibers: 1.37 M/s yields over time/100 fibers: 1.50 M/s yields over time/10 fibers: 2.21 M/s yields over time/1 fiber: 1.70 M/s (cd _build/default/bench && .\main.exe -brief "Picos Cancel_after with Picos_select") Picos Cancel_after with Picos_select: async round-trips over time/1 worker: 0.02 M/s async round-trips over time/2 workers: 0.03 M/s async round-trips over time/4 workers: 0.02 M/s round-trips over time/1 worker: 0.00 M/s round-trips over time/2 workers: 0.00 M/s round-trips over time/4 workers: 0.00 M/s time per async round-trip/1 worker: 41896.67 ns time per async round-trip/2 workers: 67056.11 ns time per async round-trip/4 workers: 257989.14 ns time per round-trip/1 worker: 6870584.20 ns time per round-trip/2 workers: 6971972.50 ns time per round-trip/4 workers: 8058536.40 ns (cd _build/default/bench && .\main.exe -brief "Ref with Picos_std_sync") Ref with Picos_std_sync: ops over time/cas int with Lock: 16.57 M/s ops over time/cas int with Rwlock: 8.43 M/s ops over time/cas int with Sem: 11.22 M/s ops over time/get with Lock: 15.45 M/s ops over time/get with Rwlock: 25.63 M/s ops over time/get with Sem: 24.96 M/s ops over time/incr with Lock: 14.63 M/s ops over time/incr with Rwlock: 13.39 M/s ops over time/incr with Sem: 24.05 M/s ops over time/push & pop with Lock: 6.10 M/s ops over time/push & pop with Rwlock: 6.05 M/s ops over time/push & pop with Sem: 10.81 M/s ops over time/swap with Lock: 10.75 M/s ops over time/swap with Rwlock: 9.45 M/s ops over time/swap with Sem: 10.43 M/s ops over time/xchg int with Lock: 12.99 M/s ops over time/xchg int with Rwlock: 10.50 M/s ops over time/xchg int with Sem: 14.49 M/s time per op/cas int with Lock: 60.33 ns time per op/cas int with Rwlock: 118.61 ns time per op/cas int with Sem: 89.10 ns time per op/get with Lock: 64.73 ns time per op/get with Rwlock: 39.02 ns time per op/get with Sem: 40.07 ns time per op/incr with Lock: 68.35 ns time per op/incr with Rwlock: 74.68 ns time per op/incr with Sem: 41.58 ns time per op/push & pop with Lock: 164.01 ns time per op/push & pop with Rwlock: 165.41 ns time per op/push & pop with Sem: 92.52 ns time per op/swap with Lock: 93.00 ns time per op/swap with Rwlock: 105.87 ns time per op/swap with Sem: 95.86 ns time per op/xchg int with Lock: 76.99 ns time per op/xchg int with Rwlock: 95.21 ns time per op/xchg int with Sem: 69.00 ns (cd _build/default/bench && .\main.exe -brief Picos_mpmcq) Picos_mpmcq: messages over time/1 nb adder, 1 nb taker: 3.87 M/s messages over time/1 nb adder, 2 nb takers: 6.08 M/s messages over time/1 nb adder, 4 nb takers: 6.19 M/s messages over time/2 nb adders, 1 nb taker: 10.64 M/s messages over time/2 nb adders, 2 nb takers: 7.80 M/s messages over time/2 nb adders, 4 nb takers: 8.42 M/s messages over time/4 nb adders, 1 nb taker: 7.50 M/s messages over time/4 nb adders, 2 nb takers: 3.83 M/s messages over time/4 nb adders, 4 nb takers: 4.62 M/s messages over time/one domain: 6.64 M/s time per message/1 nb adder, 1 nb taker: 517.32 ns time per message/1 nb adder, 2 nb takers: 493.25 ns time per message/1 nb adder, 4 nb takers: 807.95 ns time per message/2 nb adders, 1 nb taker: 281.84 ns time per message/2 nb adders, 2 nb takers: 512.96 ns time per 
message/2 nb adders, 4 nb takers: 712.21 ns time per message/4 nb adders, 1 nb taker: 666.81 ns time per message/4 nb adders, 2 nb takers: 1564.56 ns time per message/4 nb adders, 4 nb takers: 1733.33 ns time per message/one domain: 150.66 ns (cd _build/default/bench && .\main.exe -brief Picos_mpscq) Picos_mpscq: messages over time/1 nb adder, 1 nb taker: 4.16 M/s messages over time/2 nb adders, 1 nb taker: 6.43 M/s messages over time/4 nb adders, 1 nb taker: 5.96 M/s messages over time/one domain: 10.10 M/s time per message/1 nb adder, 1 nb taker: 480.22 ns time per message/2 nb adders, 1 nb taker: 466.75 ns time per message/4 nb adders, 1 nb taker: 838.39 ns time per message/one domain: 98.97 ns (cd _build/default/bench && .\main.exe -brief Picos_htbl) Picos_htbl: operations over time/1 worker, 10% reads: 7.48 M/s operations over time/1 worker, 50% reads: 9.45 M/s operations over time/1 worker, 90% reads: 14.89 M/s operations over time/2 workers, 10% reads: 14.53 M/s operations over time/2 workers, 50% reads: 16.71 M/s operations over time/2 workers, 90% reads: 17.62 M/s operations over time/4 workers, 10% reads: 13.18 M/s operations over time/4 workers, 50% reads: 22.86 M/s operations over time/4 workers, 90% reads: 34.99 M/s operations over time/8 workers, 10% reads: 21.67 M/s operations over time/8 workers, 50% reads: 36.02 M/s operations over time/8 workers, 90% reads: 55.74 M/s time per operation/1 worker, 10% reads: 133.74 ns time per operation/1 worker, 50% reads: 105.78 ns time per operation/1 worker, 90% reads: 67.16 ns time per operation/2 workers, 10% reads: 137.68 ns time per operation/2 workers, 50% reads: 119.72 ns time per operation/2 workers, 90% reads: 113.54 ns time per operation/4 workers, 10% reads: 303.49 ns time per operation/4 workers, 50% reads: 174.97 ns time per operation/4 workers, 90% reads: 114.32 ns time per operation/8 workers, 10% reads: 369.16 ns time per operation/8 workers, 50% reads: 222.13 ns time per operation/8 workers, 90% reads: 143.53 ns (cd _build/default/bench && .\main.exe -brief "Hashtbl with Picos_std_sync") Hashtbl with Picos_std_sync: operations over time/1 worker, 10% reads with Lock: 3.46 M/s operations over time/1 worker, 10% reads with Rwlock: 6.26 M/s operations over time/1 worker, 10% reads with Sem: 3.48 M/s operations over time/1 worker, 100% reads with Lock: 9.33 M/s operations over time/1 worker, 100% reads with Rwlock: 4.17 M/s operations over time/1 worker, 100% reads with Sem: 9.61 M/s operations over time/1 worker, 50% reads with Lock: 3.84 M/s operations over time/1 worker, 50% reads with Rwlock: 5.58 M/s operations over time/1 worker, 50% reads with Sem: 7.42 M/s operations over time/1 worker, 90% reads with Lock: 3.35 M/s operations over time/1 worker, 90% reads with Rwlock: 6.05 M/s operations over time/1 worker, 90% reads with Sem: 8.70 M/s operations over time/1 worker, 95% reads with Lock: 8.77 M/s operations over time/1 worker, 95% reads with Rwlock: 8.92 M/s operations over time/1 worker, 95% reads with Sem: 7.77 M/s operations over time/2 workers, 10% reads with Lock: 4.79 M/s operations over time/2 workers, 10% reads with Rwlock: 5.65 M/s operations over time/2 workers, 10% reads with Sem: 5.88 M/s operations over time/2 workers, 100% reads with Lock: 9.24 M/s operations over time/2 workers, 100% reads with Rwlock: 9.93 M/s operations over time/2 workers, 100% reads with Sem: 6.03 M/s operations over time/2 workers, 50% reads with Lock: 8.32 M/s operations over time/2 workers, 50% reads with Rwlock: 4.63 M/s 
operations over time/2 workers, 50% reads with Sem: 6.44 M/s operations over time/2 workers, 90% reads with Lock: 8.54 M/s operations over time/2 workers, 90% reads with Rwlock: 6.50 M/s operations over time/2 workers, 90% reads with Sem: 6.21 M/s operations over time/2 workers, 95% reads with Lock: 5.96 M/s operations over time/2 workers, 95% reads with Rwlock: 7.25 M/s operations over time/2 workers, 95% reads with Sem: 6.13 M/s operations over time/4 workers, 10% reads with Lock: 5.26 M/s operations over time/4 workers, 10% reads with Rwlock: 5.59 M/s operations over time/4 workers, 10% reads with Sem: 6.62 M/s operations over time/4 workers, 100% reads with Lock: 6.80 M/s operations over time/4 workers, 100% reads with Rwlock: 10.32 M/s operations over time/4 workers, 100% reads with Sem: 7.27 M/s operations over time/4 workers, 50% reads with Lock: 5.97 M/s operations over time/4 workers, 50% reads with Rwlock: 6.03 M/s operations over time/4 workers, 50% reads with Sem: 6.21 M/s operations over time/4 workers, 90% reads with Lock: 5.50 M/s operations over time/4 workers, 90% reads with Rwlock: 6.84 M/s operations over time/4 workers, 90% reads with Sem: 6.35 M/s operations over time/4 workers, 95% reads with Lock: 6.34 M/s operations over time/4 workers, 95% reads with Rwlock: 7.20 M/s operations over time/4 workers, 95% reads with Sem: 6.82 M/s operations over time/8 workers, 10% reads with Lock: 3.92 M/s operations over time/8 workers, 10% reads with Rwlock: 5.12 M/s operations over time/8 workers, 10% reads with Sem: 2.84 M/s operations over time/8 workers, 100% reads with Lock: 5.37 M/s operations over time/8 workers, 100% reads with Rwlock: 13.61 M/s operations over time/8 workers, 100% reads with Sem: 7.98 M/s operations over time/8 workers, 50% reads with Lock: 5.02 M/s operations over time/8 workers, 50% reads with Rwlock: 4.04 M/s operations over time/8 workers, 50% reads with Sem: 5.55 M/s operations over time/8 workers, 90% reads with Lock: 5.32 M/s operations over time/8 workers, 90% reads with Rwlock: 6.17 M/s operations over time/8 workers, 90% reads with Sem: 6.08 M/s operations over time/8 workers, 95% reads with Lock: 5.22 M/s operations over time/8 workers, 95% reads with Rwlock: 4.33 M/s operations over time/8 workers, 95% reads with Sem: 5.97 M/s time per operation/1 worker, 10% reads with Lock: 289.33 ns time per operation/1 worker, 10% reads with Rwlock: 159.72 ns time per operation/1 worker, 10% reads with Sem: 287.19 ns time per operation/1 worker, 100% reads with Lock: 107.16 ns time per operation/1 worker, 100% reads with Rwlock: 239.72 ns time per operation/1 worker, 100% reads with Sem: 104.02 ns time per operation/1 worker, 50% reads with Lock: 260.63 ns time per operation/1 worker, 50% reads with Rwlock: 179.19 ns time per operation/1 worker, 50% reads with Sem: 134.71 ns time per operation/1 worker, 90% reads with Lock: 298.11 ns time per operation/1 worker, 90% reads with Rwlock: 165.16 ns time per operation/1 worker, 90% reads with Sem: 114.97 ns time per operation/1 worker, 95% reads with Lock: 114.00 ns time per operation/1 worker, 95% reads with Rwlock: 112.08 ns time per operation/1 worker, 95% reads with Sem: 128.62 ns time per operation/2 workers, 10% reads with Lock: 417.92 ns time per operation/2 workers, 10% reads with Rwlock: 354.08 ns time per operation/2 workers, 10% reads with Sem: 340.21 ns time per operation/2 workers, 100% reads with Lock: 216.45 ns time per operation/2 workers, 100% reads with Rwlock: 201.36 ns time per operation/2 
workers, 100% reads with Sem: 331.87 ns time per operation/2 workers, 50% reads with Lock: 240.34 ns time per operation/2 workers, 50% reads with Rwlock: 432.21 ns time per operation/2 workers, 50% reads with Sem: 310.48 ns time per operation/2 workers, 90% reads with Lock: 234.06 ns time per operation/2 workers, 90% reads with Rwlock: 307.67 ns time per operation/2 workers, 90% reads with Sem: 322.31 ns time per operation/2 workers, 95% reads with Lock: 335.80 ns time per operation/2 workers, 95% reads with Rwlock: 275.87 ns time per operation/2 workers, 95% reads with Sem: 326.53 ns time per operation/4 workers, 10% reads with Lock: 760.22 ns time per operation/4 workers, 10% reads with Rwlock: 715.29 ns time per operation/4 workers, 10% reads with Sem: 604.34 ns time per operation/4 workers, 100% reads with Lock: 588.49 ns time per operation/4 workers, 100% reads with Rwlock: 387.76 ns time per operation/4 workers, 100% reads with Sem: 550.22 ns time per operation/4 workers, 50% reads with Lock: 669.66 ns time per operation/4 workers, 50% reads with Rwlock: 662.81 ns time per operation/4 workers, 50% reads with Sem: 644.23 ns time per operation/4 workers, 90% reads with Lock: 726.78 ns time per operation/4 workers, 90% reads with Rwlock: 584.42 ns time per operation/4 workers, 90% reads with Sem: 629.63 ns time per operation/4 workers, 95% reads with Lock: 630.99 ns time per operation/4 workers, 95% reads with Rwlock: 555.46 ns time per operation/4 workers, 95% reads with Sem: 586.24 ns time per operation/8 workers, 10% reads with Lock: 2039.86 ns time per operation/8 workers, 10% reads with Rwlock: 1564.01 ns time per operation/8 workers, 10% reads with Sem: 2815.42 ns time per operation/8 workers, 100% reads with Lock: 1490.87 ns time per operation/8 workers, 100% reads with Rwlock: 587.61 ns time per operation/8 workers, 100% reads with Sem: 1001.95 ns time per operation/8 workers, 50% reads with Lock: 1594.33 ns time per operation/8 workers, 50% reads with Rwlock: 1981.38 ns time per operation/8 workers, 50% reads with Sem: 1441.49 ns time per operation/8 workers, 90% reads with Lock: 1505.16 ns time per operation/8 workers, 90% reads with Rwlock: 1297.20 ns time per operation/8 workers, 90% reads with Sem: 1315.24 ns time per operation/8 workers, 95% reads with Lock: 1532.73 ns time per operation/8 workers, 95% reads with Rwlock: 1847.27 ns time per operation/8 workers, 95% reads with Sem: 1340.26 ns (cd _build/default/bench && .\main.exe -brief Picos_stdio) Picos_stdio: (cd _build/default/bench && .\main.exe -brief "Picos_sync Stream") Picos_sync Stream: messages over time/1 nb pusher, 1 nb reader: 3.02 M/s messages over time/2 nb pushers, 1 nb reader: 2.74 M/s messages over time/4 nb pushers, 1 nb reader: 2.61 M/s messages over time/one domain: 2.23 M/s time per message/1 nb pusher, 1 nb reader: 662.94 ns time per message/2 nb pushers, 1 nb reader: 1094.72 ns time per message/4 nb pushers, 1 nb reader: 1916.47 ns time per message/one domain: 448.37 ns (cd _build/default/bench && .\main.exe -brief Fib) Fib: spawns over time/1 mfifo, fib 20: 0.17 M/s spawns over time/2 mfifos, fib 20: 0.57 M/s spawns over time/4 mfifos, fib 20: 0.32 M/s spawns over time/8 mfifos, fib 20: 0.65 M/s time per spawn/1 mfifo, fib 20: 5899.80 ns time per spawn/2 mfifos, fib 20: 3535.26 ns time per spawn/4 mfifos, fib 20: 12310.81 ns time per spawn/8 mfifos, fib 20: 12402.02 ns (cd _build/default/bench && .\main.exe -brief "Picos binaries") Picos binaries: binary size/picos: 62.47 kB binary 
size/picos.domain: 2.51 kB binary size/picos.thread: 2.11 kB binary size/picos_aux.htbl: 40.76 kB binary size/picos_aux.mpmcq: 11.13 kB binary size/picos_aux.mpscq: 13.75 kB binary size/picos_aux.rc: 12.06 kB binary size/picos_io: 76.09 kB binary size/picos_io.fd: 6.99 kB binary size/picos_io.select: 45.95 kB binary size/picos_io_cohttp: 35.91 kB binary size/picos_lwt: 18.78 kB binary size/picos_lwt.unix: 11.04 kB binary size/picos_mux.fifo: 18.57 kB binary size/picos_mux.multififo: 45.71 kB binary size/picos_mux.random: 35.37 kB binary size/picos_mux.thread: 15.89 kB binary size/picos_std.awaitable: 24.89 kB binary size/picos_std.event: 15.68 kB binary size/picos_std.finally: 14.02 kB binary size/picos_std.structured: 63.00 kB binary size/picos_std.sync: 136.63 kB (cd _build/default/bench && .\main.exe -brief "Bounded_q with Picos_std_sync") Bounded_q with Picos_std_sync: messages over time/1 adder, 1 taker with Lock: 5.02 M/s messages over time/1 adder, 2 takers with Lock: 2.48 M/s messages over time/1 adder, 4 takers with Lock: 0.62 M/s messages over time/2 adders, 1 taker with Lock: 3.29 M/s messages over time/2 adders, 2 takers with Lock: 2.68 M/s messages over time/2 adders, 4 takers with Lock: 3.11 M/s messages over time/4 adders, 1 taker with Lock: 3.52 M/s messages over time/4 adders, 2 takers with Lock: 1.52 M/s messages over time/4 adders, 4 takers with Lock: 1.93 M/s messages over time/one domain with Lock: 6.08 M/s time per message/1 adder, 1 taker with Lock: 398.26 ns time per message/1 adder, 2 takers with Lock: 1210.66 ns time per message/1 adder, 4 takers with Lock: 8079.53 ns time per message/2 adders, 1 taker with Lock: 912.53 ns time per message/2 adders, 2 takers with Lock: 1491.53 ns time per message/2 adders, 4 takers with Lock: 1932.15 ns time per message/4 adders, 1 taker with Lock: 1421.19 ns time per message/4 adders, 2 takers with Lock: 3935.14 ns time per message/4 adders, 4 takers with Lock: 4144.95 ns time per message/one domain with Lock: 164.45 ns (cd _build/default/bench && .\main.exe -brief "Memory usage") Memory usage: stack and heap used/Fun.protect: 80.00 B stack and heap used/fiber in a bundle: 240.00 B stack and heap used/fiber in a flock: 256.00 B stack and heap used/fiber with shared computation & latch: 240.00 B stack and heap used/finally: 40.00 B stack and heap used/instantiate: 96.00 B stack and heap used/join_after bundle: 248.00 B stack and heap used/join_after flock: 248.00 B stack and heap used/lastly: 32.00 B stack and heap used/promise in a bundle: 360.00 B stack and heap used/promise in a flock: 376.00 B 2025-08-04 11:28.54 ---> saved as "9eb5e65ac33e86c0512eb79f89099823f309e57cbe322051e145c9e263dc3e0e" Job succeeded 2025-08-04 11:28.54: Job succeeded
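(Note: the benchmark sections above report each result both as throughput in M/s and as latency in ns per operation; the two are reciprocals, ns/op = 1e9 / ops_per_second. The snippet below is an illustrative, self-contained measurement loop in that spirit, using only the OCaml standard library plus Unix for wall-clock timing; it is not the multicore-bench harness that produced the numbers above, and the workload it times is a made-up example.)

(* Illustrative only: time a workload across a few domains and report it in
   the same two units used by the benchmarks above. *)
let time_ops ~domains ~ops_per_domain work =
  let started = Unix.gettimeofday () in
  let workers =
    List.init domains (fun _ ->
        Domain.spawn (fun () ->
            for _ = 1 to ops_per_domain do
              work ()
            done))
  in
  List.iter Domain.join workers;
  let elapsed = Unix.gettimeofday () -. started in
  let ops_per_s = float_of_int (domains * ops_per_domain) /. elapsed in
  Printf.printf "%.2f M/s, %.2f ns/op\n" (ops_per_s /. 1e6) (1e9 /. ops_per_s)

let () =
  (* Hypothetical workload: atomic increments of a counter shared by 2 domains. *)
  let counter = Atomic.make 0 in
  time_ops ~domains:2 ~ops_per_domain:1_000_000 (fun () ->
      ignore (Atomic.fetch_and_add counter 1))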