2025-05-22 20:00.59: New job: test ahrefs/ocannl https://github.com/ahrefs/ocannl.git#refs/heads/master (39741884b740497ac10065d5e464e6c70f9151f4) (linux-x86_64:alpine-3.21-5.3_opam-2.3)

Base: ocaml/opam:alpine-3.21-ocaml-5.3@sha256:5957ac9a5e2e15455a45c19299b956410a2cb1aa7a7c57821dc4032ffd942571

Opam project build

To reproduce locally:

git clone --recursive "https://github.com/ahrefs/ocannl.git" -b "master" && cd "ocannl" && git reset --hard 39741884
cat > Dockerfile <<'END-OF-DOCKERFILE'
FROM ocaml/opam:alpine-3.21-ocaml-5.3@sha256:5957ac9a5e2e15455a45c19299b956410a2cb1aa7a7c57821dc4032ffd942571
# alpine-3.21-5.3_opam-2.3
USER 1000:1000
ENV CLICOLOR_FORCE="1"
ENV OPAMCOLOR="always"
WORKDIR /src
RUN sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam
RUN opam init --reinit -ni
RUN uname -rs && opam exec -- ocaml -version && opam --version
WORKDIR /src
RUN sudo chown opam /src
RUN cd ~/opam-repository && (git cat-file -e 2df846cb67d6f96ae4fced111519ff4ae27d19ae || git fetch origin master) && git reset -q --hard 2df846cb67d6f96ae4fced111519ff4ae27d19ae && git log --no-decorate -n1 --oneline && opam update -u
COPY --chown=1000:1000 neural_nets_lib.opam arrayjit.opam ./
RUN opam pin add -yn neural_nets_lib.dev './' && \
    opam pin add -yn arrayjit.dev './'
RUN echo '(lang dune 3.0)' > './dune-project'
ENV DEPS="angstrom.0.16.1 astring.0.8.5 backoff.0.1.1 base.v0.17.2 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.19.0 dune-configurator.3.19.0 fieldslib.v0.17.0 fmt.0.10.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 logs.0.8.0 mdx.2.5.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.0 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 pprint.20230830 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.2.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 result.1.5 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 thread-local-storage.0.2 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0"
ENV CI="true"
ENV OCAMLCI="true"
RUN opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS
RUN opam install $DEPS
COPY --chown=1000:1000 . /src
RUN opam exec -- dune build @install @check @runtest && rm -rf _build
END-OF-DOCKERFILE
docker build .
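The repro recipe above can be staged as a single script. A minimal sketch, assuming a POSIX shell; the clone step needs network access and is left commented out, the Dockerfile heredoc is abbreviated to a skeleton (the full ENV DEPS list and RUN sequence are spelled out above), and the final `docker build` is printed rather than executed so the sketch runs offline:

```shell
#!/bin/sh
set -eu

# Work in a scratch directory so repeated runs stay clean.
workdir=$(mktemp -d)
cd "$workdir"

# Step 1 (network required, so commented out here): clone and pin the commit under test.
# git clone --recursive "https://github.com/ahrefs/ocannl.git" -b "master"
# cd ocannl && git reset --hard 39741884

# Step 2: write the Dockerfile skeleton; see the heredoc in the log above
# for the full ENV DEPS list and the remaining RUN steps.
cat > Dockerfile <<'END-OF-DOCKERFILE'
FROM ocaml/opam:alpine-3.21-ocaml-5.3@sha256:5957ac9a5e2e15455a45c19299b956410a2cb1aa7a7c57821dc4032ffd942571
USER 1000:1000
WORKDIR /src
RUN opam exec -- dune build @install @check @runtest && rm -rf _build
END-OF-DOCKERFILE

# Step 3: build (printed, not run, in this sketch).
echo "next: docker build ."
```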
END-REPRO-BLOCK

2025-05-22 20:00.59: Using cache hint "ahrefs/ocannl-ocaml/opam:alpine-3.21-ocaml-5.3@sha256:5957ac9a5e2e15455a45c19299b956410a2cb1aa7a7c57821dc4032ffd942571-alpine-3.21-5.3_opam-2.3-63d0fa7caba437c680f3f62d33f451da"
2025-05-22 20:00.59: Using OBuilder spec:
((from ocaml/opam:alpine-3.21-ocaml-5.3@sha256:5957ac9a5e2e15455a45c19299b956410a2cb1aa7a7c57821dc4032ffd942571)
 (comment alpine-3.21-5.3_opam-2.3)
 (user (uid 1000) (gid 1000))
 (env CLICOLOR_FORCE 1)
 (env OPAMCOLOR always)
 (workdir /src)
 (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam"))
 (run (shell "opam init --reinit -ni"))
 (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
 (workdir /src)
 (run (shell "sudo chown opam /src"))
 (run (cache (opam-archives (target /home/opam/.opam/download-cache)))
      (network host)
      (shell "cd ~/opam-repository && (git cat-file -e 2df846cb67d6f96ae4fced111519ff4ae27d19ae || git fetch origin master) && git reset -q --hard 2df846cb67d6f96ae4fced111519ff4ae27d19ae && git log --no-decorate -n1 --oneline && opam update -u"))
 (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./))
 (run (network host)
      (shell "opam pin add -yn neural_nets_lib.dev './' && \
             \nopam pin add -yn arrayjit.dev './'"))
 (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'"))
 (env DEPS "angstrom.0.16.1 astring.0.8.5 backoff.0.1.1 base.v0.17.2 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.19.0 dune-configurator.3.19.0 fieldslib.v0.17.0 fmt.0.10.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 logs.0.8.0 mdx.2.5.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.0 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 pprint.20230830 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.2.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 result.1.5 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 thread-local-storage.0.2 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0")
 (env CI true)
 (env OCAMLCI true)
 (run (cache (opam-archives (target /home/opam/.opam/download-cache)))
      (network host)
      (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS"))
 (run (cache (opam-archives (target /home/opam/.opam/download-cache)))
      (network host)
      (shell "opam install $DEPS"))
 (copy (src .) (dst /src))
 (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build"))
)

2025-05-22 20:00.59: Waiting for resource in pool OCluster
2025-05-22 20:00.59: Waiting for worker…
2025-05-22 20:01.13: Got resource from pool OCluster
Building on toxis.caelum.ci.dev
All commits already cached
HEAD is now at 39741884 Untested: convert remaining uses of Format except where printing Sexp values

(from ocaml/opam:alpine-3.21-ocaml-5.3@sha256:5957ac9a5e2e15455a45c19299b956410a2cb1aa7a7c57821dc4032ffd942571)
2025-05-22 20:01.19 ---> using "e49bbe0abcbff463a9108ac76ab9842b22d9952ceb038ff96c3d7ae7970ff962" from cache

/: (comment alpine-3.21-5.3_opam-2.3)
/: (user (uid 1000) (gid 1000))
/: (env CLICOLOR_FORCE 1)
/: (env OPAMCOLOR always)
/: (workdir /src)
/src: (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam"))
2025-05-22 20:01.19 ---> using "64b7ced97fd2f4eb6b85dd4ff640c243d5a7cac2c249f19cfd789778c95e1d2f" from cache

/src: (run (shell "opam init --reinit -ni"))
Configuring from /home/opam/.opamrc and then from built-in defaults.
Checking for available remotes: rsync and local, git.
  - you won't be able to use mercurial repositories unless you install the hg command on your system.
  - you won't be able to use darcs repositories unless you install the darcs command on your system.

This development version of opam requires an update to the layout of /home/opam/.opam from version 2.0 to version 2.2, which can't be reverted.
You may want to back it up before going further.

Continue? [y/n] y
[NOTE] The 'jobs' option was reset, its value was 255 and its new value will vary according to the current number of cores on your machine. You can restore the fixed value using:
    opam option jobs=255 --global
Format upgrade done.
<><> Updating repositories ><><><><><><><><><><><><><><><><><><><><><><><><><><>
[default] Initialised
2025-05-22 20:01.19 ---> using "49db337dae0bac71dc6cec7c726897191bef158123b428b74e27d62f0a6b1936" from cache

/src: (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
Linux 5.15.0-134-generic
The OCaml toplevel, version 5.3.0
2.3.0
2025-05-22 20:01.19 ---> using "4c4b8b86a744581f3e1bd14a680e0d8cd636a7b643c1259f9d18c66b9e00b6ed" from cache

/src: (workdir /src)

/src: (run (shell "sudo chown opam /src"))
2025-05-22 20:01.19 ---> using "1d5c84a51aeb1e666c1d485b5bc48d792b94b09ea837c8e61ecb04ba67b721db" from cache

/src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 2df846cb67d6f96ae4fced111519ff4ae27d19ae || git fetch origin master) && git reset -q --hard 2df846cb67d6f96ae4fced111519ff4ae27d19ae && git log --no-decorate -n1 --oneline && opam update -u"))
From https://github.com/ocaml/opam-repository
 * branch                master     -> FETCH_HEAD
   35eb2f107a..2df846cb67  master     -> origin/master
2df846cb67 Merge pull request #27910 from maiste/release-dune-3.19.0

<><> Updating package repositories ><><><><><><><><><><><><><><><><><><><><><><>
[default] synchronised from git+file:///home/opam/opam-repository
Everything as up-to-date as possible (run with --verbose to show unavailable upgrades).
However, you may "opam upgrade" these packages explicitly, which will ask permission to downgrade or uninstall the conflicting packages.
Nothing to do.
# To update the current shell environment, run: eval $(opam env)
2025-05-22 20:01.19 ---> using "6e2b174ea98a37fbeacad981a7f9d4fbd9b4014cde55a3d7193902a5a56836ef" from cache

/src: (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./))
2025-05-22 20:01.21 ---> saved as "6eccd14c5988b8566700351c688cd56ee23159603f29dd0dd7c829bbd0e5738f"

/src: (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \
\nopam pin add -yn arrayjit.dev './'"))
[neural_nets_lib.dev] synchronised (file:///src)
neural_nets_lib is now pinned to file:///src (version dev)
[arrayjit.dev] synchronised (file:///src)
arrayjit is now pinned to file:///src (version dev)
2025-05-22 20:01.31 ---> saved as "668b7aa54b3ee589977a42f67efb6146fc19560535879305c54232fa0323817e"

/src: (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'"))
2025-05-22 20:01.31 ---> saved as "8ed138ea618cbab6ea246b211732ae2e9c88e55ae160c3808d7bcb9d490453e6"

/src: (env DEPS "angstrom.0.16.1 astring.0.8.5 backoff.0.1.1 base.v0.17.2 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.19.0 dune-configurator.3.19.0 fieldslib.v0.17.0 fmt.0.10.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 logs.0.8.0 mdx.2.5.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.0 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 pprint.20230830 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.2.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 result.1.5 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 thread-local-storage.0.2 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0")

/src: (env CI true)

/src: (env OCAMLCI true)

/src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS"))
+ /usr/bin/sudo "apk" "update"
- fetch https://dl-cdn.alpinelinux.org/alpine/v3.21/main/x86_64/APKINDEX.tar.gz
- fetch https://dl-cdn.alpinelinux.org/alpine/v3.21/community/x86_64/APKINDEX.tar.gz
- fetch https://dl-cdn.alpinelinux.org/alpine/edge/main/x86_64/APKINDEX.tar.gz
- fetch https://dl-cdn.alpinelinux.org/alpine/edge/community/x86_64/APKINDEX.tar.gz
- fetch https://dl-cdn.alpinelinux.org/alpine/edge/testing/x86_64/APKINDEX.tar.gz
- v3.21.3-519-g661074ec05e [https://dl-cdn.alpinelinux.org/alpine/v3.21/main]
- v3.21.3-510-g4f507628865 [https://dl-cdn.alpinelinux.org/alpine/v3.21/community]
- v20250108-8756-g1ef56896610 [https://dl-cdn.alpinelinux.org/alpine/edge/main]
- v20250108-8752-ge156553f96d [https://dl-cdn.alpinelinux.org/alpine/edge/community]
- v20250108-8750-gf897a2821ec [https://dl-cdn.alpinelinux.org/alpine/edge/testing]
- OK: 58275 distinct packages available

<><> Synchronising pinned packages ><><><><><><><><><><><><><><><><><><><><><><>
[arrayjit.dev] synchronised (file:///src)
[neural_nets_lib.dev] synchronised (file:///src)

[NOTE] Package ocaml-options-vanilla is already installed (current version is 1).
[NOTE] Package ocaml-config is already installed (current version is 3).
[NOTE] Package ocaml-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml is already installed (current version is 5.3.0).
[NOTE] Package base-unix is already installed (current version is base).
[NOTE] Package base-threads is already installed (current version is base).
[NOTE] Package base-nnp is already installed (current version is base).
[NOTE] Package base-effects is already installed (current version is base).
[NOTE] Package base-domains is already installed (current version is base).
[NOTE] Package base-bigarray is already installed (current version is base).
The following system packages will first need to be installed:
    libffi-dev

<><> Handling external dependencies <><><><><><><><><><><><><><><><><><><><><><>
+ /usr/bin/sudo "apk" "add" "libffi-dev"
- (1/2) Installing linux-headers (6.6-r1)
- (2/2) Installing libffi-dev (3.4.7-r0)
- OK: 312 MiB in 104 packages
2025-05-22 20:02.04 ---> saved as "0d70510df14a58151faf2cac667b7f88a0a1d5b52e1ddb2712aa07e179b57379"

/src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS"))
[NOTE] Package ocaml-options-vanilla is already installed (current version is 1).
[NOTE] Package ocaml-config is already installed (current version is 3).
[NOTE] Package ocaml-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0).
[NOTE] Package ocaml is already installed (current version is 5.3.0).
[NOTE] Package base-unix is already installed (current version is base).
[NOTE] Package base-threads is already installed (current version is base).
[NOTE] Package base-nnp is already installed (current version is base).
[NOTE] Package base-effects is already installed (current version is base).
[NOTE] Package base-domains is already installed (current version is base).
[NOTE] Package base-bigarray is already installed (current version is base).
The following actions will be performed:
=== install 75 packages
  - install angstrom                 0.16.1
  - install astring                  0.8.5
  - install backoff                  0.1.1
  - install base                     v0.17.2
  - install bigarray-compat          1.1.0
  - install bigstringaf              0.10.0
  - install camlp-streams            5.0.1
  - install cmdliner                 1.3.0
  - install conf-libffi              2.0.0
  - install conf-pkg-config          4
  - install cppo                     1.8.0
  - install csexp                    1.5.2
  - install ctypes                   0.23.0
  - install ctypes-foreign           0.23.0
  - install dune                     3.19.0
  - install dune-configurator        3.19.0
  - install fieldslib                v0.17.0
  - install fmt                      0.10.0
  - install integers                 0.7.0
  - install jane-street-headers      v0.17.0
  - install jst-config               v0.17.0
  - install logs                     0.8.0
  - install mdx                      2.5.0
  - install mtime                    2.1.0
  - install multicore-magic          2.3.1
  - install num                      1.5-1
  - install ocaml-compiler-libs      v0.17.0
  - install ocaml-syntax-shims       1.0.0
  - install ocaml-version            4.0.0
  - install ocaml_intrinsics_kernel  v0.17.1
  - install ocamlbuild               0.16.1
  - install ocamlfind                1.9.8
  - install parsexp                  v0.17.0
  - install pprint                   20230830
  - install ppx_assert               v0.17.0
  - install ppx_base                 v0.17.0
  - install ppx_cold                 v0.17.0
  - install ppx_compare              v0.17.0
  - install ppx_derivers             1.2.1
  - install ppx_deriving             6.0.3
  - install ppx_enumerate            v0.17.0
  - install ppx_expect               v0.17.2
  - install ppx_fields_conv          v0.17.0
  - install ppx_globalize            v0.17.0
  - install ppx_hash                 v0.17.0
  - install ppx_here                 v0.17.0
  - install ppx_inline_test          v0.17.0
  - install ppx_minidebug            2.2.0
  - install ppx_optcomp              v0.17.0
  - install ppx_sexp_conv            v0.17.0
  - install ppx_string               v0.17.0
  - install ppx_variants_conv        v0.17.0
  - install ppxlib                   0.35.0
  - install ppxlib_jane              v0.17.2
  - install printbox                 0.12
  - install printbox-ext-plot        0.12
  - install printbox-html            0.12
  - install printbox-md              0.12
  - install printbox-text            0.12
  - install ptime                    1.2.0
  - install re                       1.12.0
  - install result                   1.5
  - install saturn_lockfree          0.5.0
  - install seq                      base
  - install sexplib                  v0.17.0
  - install sexplib0                 v0.17.0
  - install stdio                    v0.17.0
  - install stdlib-shims             0.3.0
  - install thread-local-storage     0.2
  - install time_now                 v0.17.0
  - install topkg                    1.0.8
  - install tyxml                    4.6.0
  - install uucp                     16.0.0
  - install uutf                     1.0.4
  - install variantslib              v0.17.0

<><> Processing actions <><><><><><><><><><><><><><><><><><><><><><><><><><><><>
-> retrieved backoff.0.1.1  (cached)
-> retrieved angstrom.0.16.1  (cached)
-> retrieved astring.0.8.5  (cached)
-> retrieved base.v0.17.2  (cached)
-> retrieved bigarray-compat.1.1.0  (cached)
-> retrieved bigstringaf.0.10.0  (cached)
-> retrieved camlp-streams.5.0.1  (cached)
-> retrieved cppo.1.8.0  (cached)
-> retrieved cmdliner.1.3.0  (cached)
-> installed conf-pkg-config.4
-> retrieved csexp.1.5.2  (cached)
-> retrieved ctypes.0.23.0, ctypes-foreign.0.23.0  (cached)
-> installed conf-libffi.2.0.0
-> retrieved fieldslib.v0.17.0  (cached)
-> retrieved fmt.0.10.0  (cached)
-> retrieved integers.0.7.0  (cached)
-> retrieved jane-street-headers.v0.17.0  (cached)
-> retrieved jst-config.v0.17.0  (cached)
-> retrieved logs.0.8.0  (cached)
-> retrieved mtime.2.1.0  (cached)
-> retrieved mdx.2.5.0  (cached)
-> retrieved multicore-magic.2.3.1  (cached)
-> retrieved num.1.5-1  (cached)
-> retrieved ocaml-compiler-libs.v0.17.0  (cached)
-> retrieved ocaml-syntax-shims.1.0.0  (cached)
-> retrieved ocaml-version.4.0.0  (cached)
-> retrieved ocaml_intrinsics_kernel.v0.17.1  (cached)
-> retrieved ocamlbuild.0.16.1  (cached)
-> retrieved ocamlfind.1.9.8  (cached)
-> retrieved parsexp.v0.17.0  (cached)
-> retrieved pprint.20230830  (cached)
-> retrieved ppx_assert.v0.17.0  (cached)
-> retrieved ppx_base.v0.17.0  (cached)
-> retrieved ppx_cold.v0.17.0  (cached)
-> retrieved ppx_compare.v0.17.0  (cached)
-> retrieved ppx_derivers.1.2.1  (cached)
-> retrieved ppx_enumerate.v0.17.0  (cached)
-> retrieved ppx_deriving.6.0.3  (cached)
-> retrieved ppx_expect.v0.17.2  (cached)
-> retrieved ppx_fields_conv.v0.17.0  (cached)
-> retrieved ppx_globalize.v0.17.0  (cached)
-> retrieved ppx_here.v0.17.0  (cached)
-> retrieved ppx_hash.v0.17.0  (cached)
-> retrieved ppx_inline_test.v0.17.0  (cached)
-> retrieved ppx_optcomp.v0.17.0  (cached)
-> retrieved ppx_sexp_conv.v0.17.0  (cached)
-> retrieved ppx_string.v0.17.0  (cached)
-> retrieved ppx_variants_conv.v0.17.0  (cached)
-> retrieved ppx_minidebug.2.2.0  (cached)
-> retrieved ppxlib_jane.v0.17.2  (cached)
-> retrieved dune.3.19.0, dune-configurator.3.19.0  (cached)
-> retrieved ppxlib.0.35.0  (cached)
-> retrieved printbox.0.12, printbox-ext-plot.0.12, printbox-html.0.12, printbox-md.0.12, printbox-text.0.12  (cached)
-> retrieved ptime.1.2.0  (cached)
-> retrieved re.1.12.0  (cached)
-> retrieved seq.base  (cached)
-> retrieved result.1.5  (cached)
-> retrieved sexplib.v0.17.0  (cached)
-> retrieved saturn_lockfree.0.5.0  (cached)
-> installed cmdliner.1.3.0
-> installed num.1.5-1
-> installed seq.base
-> retrieved sexplib0.v0.17.0  (cached)
-> retrieved stdio.v0.17.0  (cached)
-> retrieved stdlib-shims.0.3.0  (cached)
-> retrieved thread-local-storage.0.2  (cached)
-> retrieved time_now.v0.17.0  (cached)
-> retrieved tyxml.4.6.0  (cached)
-> retrieved topkg.1.0.8  (cached)
-> retrieved uutf.1.0.4  (cached)
-> retrieved variantslib.v0.17.0  (cached)
-> retrieved uucp.16.0.0  (cached)
-> installed ocamlbuild.0.16.1
-> installed ocamlfind.1.9.8
-> installed topkg.1.0.8
-> installed mtime.2.1.0
-> installed uutf.1.0.4
-> installed fmt.0.10.0
-> installed ptime.1.2.0
-> installed astring.0.8.5
-> installed logs.0.8.0
-> installed dune.3.19.0
-> installed jane-street-headers.v0.17.0
-> installed csexp.1.5.2
-> installed backoff.0.1.1
-> installed bigarray-compat.1.1.0
-> installed camlp-streams.5.0.1
-> installed multicore-magic.2.3.1
-> installed ocaml-version.4.0.0
-> installed ocaml_intrinsics_kernel.v0.17.1
-> installed pprint.20230830
-> installed ppx_derivers.1.2.1
-> installed printbox.0.12
-> installed result.1.5
-> installed sexplib0.v0.17.0
-> installed stdlib-shims.0.3.0
-> installed ocaml-syntax-shims.1.0.0
-> installed re.1.12.0
-> installed thread-local-storage.0.2
-> installed ocaml-compiler-libs.v0.17.0
-> installed cppo.1.8.0
-> installed saturn_lockfree.0.5.0
-> installed integers.0.7.0
-> installed parsexp.v0.17.0
-> installed dune-configurator.3.19.0
-> installed sexplib.v0.17.0
-> installed bigstringaf.0.10.0
-> installed mdx.2.5.0
-> installed angstrom.0.16.1
-> installed tyxml.4.6.0
-> installed printbox-html.0.12
-> installed uucp.16.0.0
-> installed printbox-text.0.12
-> installed ctypes.0.23.0
-> installed printbox-md.0.12
-> installed printbox-ext-plot.0.12
-> installed ctypes-foreign.0.23.0
-> installed base.v0.17.2
-> installed variantslib.v0.17.0
-> installed fieldslib.v0.17.0
-> installed stdio.v0.17.0
-> installed ppxlib.0.35.0
-> installed ppx_optcomp.v0.17.0
-> installed ppxlib_jane.v0.17.2
-> installed ppx_cold.v0.17.0
-> installed ppx_variants_conv.v0.17.0
-> installed ppx_here.v0.17.0
-> installed ppx_fields_conv.v0.17.0
-> installed ppx_enumerate.v0.17.0
-> installed ppx_globalize.v0.17.0
-> installed ppx_deriving.6.0.3
-> installed ppx_compare.v0.17.0
-> installed ppx_sexp_conv.v0.17.0
-> installed ppx_hash.v0.17.0
-> installed ppx_assert.v0.17.0
-> installed ppx_base.v0.17.0
-> installed ppx_minidebug.2.2.0
-> installed jst-config.v0.17.0
-> installed ppx_string.v0.17.0
-> installed time_now.v0.17.0
-> installed ppx_inline_test.v0.17.0
-> installed ppx_expect.v0.17.2
Done.
# To update the current shell environment, run: eval $(opam env)
2025-05-22 20:05.04 ---> saved as "67bdc127d8e1eb40c4f5c8c4129efde69b0fb1523a43451c3b116f899a1e46d6"

/src: (copy (src .) (dst /src))
2025-05-22 20:05.05 ---> saved as "e0a5ae6849f131a6927c00a21a8adfda0149037b608d90c4a4a8998752095744"

/src: (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build"))
(cd _build/default/test/config && ../../arrayjit/bin/read_config.exe --read=backend)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/default/test/config/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
Wrote value of 'backend' to ocannl_backend.txt
(cd _build/default/test_ppx && ./test_ppx_op_expected.exe)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/default/test_ppx && ./test_ppx_op.exe)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition 'Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/af7b68b233c4ab153d1610ce11327f20/default/test/ocannl_config.' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition 'Found 0, in the config file' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition 'Retrieving commandline, environment, or config file variable ocannl_log_level' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition '' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
File "test/dune", lines 30-40, characters 0-281:
30 | (rule
31 |  (alias runtest)
32 |  (target
33 |   (dir log_files))
34 |  (action
35 |   (run
36 |    %{dep:micrograd_demo_logging.exe}
37 |    "--ocannl_debug_backend=text"
38 |    "--ocannl_log_file_stem=micrograd_demo_logging"
39 |    "--ocannl_log_main_domain_to_stdout=false"
40 |    "--ocannl_debug_log_to_stream_files=true")))
(cd _build/default/test && ./micrograd_demo_logging.exe --ocannl_debug_backend=text --ocannl_log_file_stem=micrograd_demo_logging --ocannl_log_main_domain_to_stdout=false --ocannl_debug_log_to_stream_files=true)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
Retrieving commandline, environment, or config file variable ocannl_backend
Found multicore_cc, in the config file
Retrieving commandline, environment, or config file variable ocannl_cd_ident_style
Not found, using default heuristic
Retrieving commandline, environment, or config file variable ocannl_ll_ident_style
Not found, using default heuristic
Retrieving commandline, environment, or config file variable ocannl_prefer_backend_uniformity
Found true, in the config file
Retrieving commandline, environment, or config file variable ocannl_debug_log_to_stream_files
Found true, commandline --ocannl_debug_log_to_stream_files=true
Retrieving commandline, environment, or config file variable ocannl_cc_backend_optimization_level
Not found, using default 3
Retrieving commandline, environment, or config file variable ocannl_cc_backend_compiler_command
Not found, using default gcc
Fatal error: exception File "src/printbox-text/PrintBox_text.ml", line 212, characters 6-12: Assertion failed
Raised at PrintBox_text.Output.Make_out.to_buf_aux_ in file "src/printbox-text/PrintBox_text.ml", line 212, characters 6-50
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 19-42
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 26-41
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 26-41
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 26-41
Called from PrintBox_text.Output.Make_out.render in file "src/printbox-text/PrintBox_text.ml", line 242, characters 14-64
Called from PrintBox_text.output in file "src/printbox-text/PrintBox_text.ml", line 851, characters 2-31
Called from Minidebug_runtime.PrintBox.output_box in file "minidebug_runtime.ml", line 1527, characters 19-59
Called from Minidebug_runtime.PrintBox.close_log_impl.close_tree in file "minidebug_runtime.ml", line 1572, characters 6-38
Called from Backends.Add_buffer_retrieval_and_syncing.sync_routine in file "arrayjit/lib/backends.ml", lines 144-172, characters 31-82
Called from Backends.Raise_backend.link in file "arrayjit/lib/backends.ml", lines 454-455, characters 4-92
Re-raised at Backends.Raise_backend.link in file "arrayjit/lib/backends.ml", lines 441-455, characters 23-92
Called from Dune__exe__Micrograd_demo_logging in file "test/micrograd_demo_logging.ml", line 34, characters 13-77
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition primitive_ops.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition zero2hero_1of7.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition hello_world_op.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition einsum_trivia.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition micrograd_demo.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition moons_demo_parallel.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
File "test/micrograd_demo.ml", line 1, characters 0-0:
/usr/bin/git --no-pager diff --no-index --color=always -u _build/default/test/micrograd_demo.ml _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/micrograd_demo.ml.corrected
diff --git a/_build/default/test/micrograd_demo.ml b/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/micrograd_demo.ml.corrected
index 77e46c6..ab81526 100644
--- a/_build/default/test/micrograd_demo.ml
+++ b/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/micrograd_demo.ml.corrected
@@ -52,15 +52,14 @@ let%expect_test "Micrograd README basic example" =
     │├┼───────┤ │
     │││ -4.00 │ │
     │└┴───────┘ │
-    └─────────────────┘
-    ┌────────────────────────┐
-    │[0]: a shape 0:1 grad_a│
-    │┌┬─────────┐ │
-    │││axis 0 │ │
-    │├┼─────────┤ │
-    │││ 1.38e+2 │ │
-    │└┴─────────┘ │
-    └────────────────────────┘
+    └─────────────────┘┌────────────────────────┐
+    │[0]: a shape 0:1 grad_a│
+    │┌┬─────────┐ │
+    │││axis 0 │ │
+    │├┼─────────┤ │
+    │││ 1.38e+2 │ │
+    │└┴─────────┘ │
+    └────────────────────────┘
     |}];
   Tensor.print ~with_code:false ~with_grad:true `Default b;
   [%expect
@@ -72,15 +71,14 @@ let%expect_test "Micrograd README basic example" =
     │├┼──────┤ │
     │││ 2.00 │ │
     │└┴──────┘ │
-    └─────────────────┘
-    ┌────────────────────────┐
-    │[2]: b shape 0:1 grad_b│
-    │┌┬─────────┐ │
-    │││axis 0 │ │
-    │├┼─────────┤ │
-    │││ 6.45e+2 │ │
-    │└┴─────────┘ │
-    └────────────────────────┘
+    └─────────────────┘┌────────────────────────┐
+    │[2]: b shape 0:1 grad_b│
+    │┌┬─────────┐ │
+    │││axis 0 │ │
+    │├┼─────────┤ │
+    │││ 6.45e+2 │ │
+    │└┴─────────┘ │
+    └────────────────────────┘
     |}]

 let%expect_test "Micrograd half-moons example" =
File "test/hello_world_op.ml", line 1, characters 0-0:
/usr/bin/git --no-pager diff --no-index --color=always -u _build/default/test/hello_world_op.ml _build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/hello_world_op.ml.corrected
diff --git a/_build/default/test/hello_world_op.ml b/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/hello_world_op.ml.corrected
index ba9d7ef..6bfa309 100644
--- a/_build/default/test/hello_world_op.ml
+++ b/_build/.sandbox/4012e687ae4244e1ad4353e92c261c63/default/test/hello_world_op.ml.corrected
@@ -102,36 +102,46 @@
   let%op hey = [ (1, 2, 3); (4, 5, 6) ] in
   Train.forward_and_forget backend ctx hey;
   Tensor.print ~with_code:false ~with_grad:false `Inline @@ hey;
-  [%expect {| [1.00, 2.00, 3.00; 4.00, 5.00, 6.00] |}];
+  [%expect {|
+    [0]: [ 1.00 , 2.00 , 3.00 ; 4.00 , 5.00 , 6.00 ]_hey shape 1:3->0:2 [
+    1.00 , 2.00 , 3.00
+    ; 4.00 , 5.00 , 6.00
+    ]
+    |}];
   Tensor.print ~with_code:false ~with_grad:false `Default @@ hey;
   [%expect
     {|
-    ┌─────────────────────────────────────────────────────────────┐
-    │[0]: [1.00, 2.00, 3.00; 4.00, 5.00, 6.00]_hey shape 1:3->0:2 │
-    │┌──────┬──────────────────┐ │
-    ││ │axis 1 │ │
-    │├──────┼──────────────────┤ │
-    ││axis 0│ 1.00 2.00 3.00 │ │
-    ││ │ 4.00 5.00 6.00 │ │
-    │└──────┴──────────────────┘ │
-    └─────────────────────────────────────────────────────────────┘
+    ┌────────────────────────────────────────────────────────────────────────┐
+    │[0]: [ 1.00 , 2.00 , 3.00 ; 4.00 , 5.00 , 6.00 ]_hey shape 1:3->0:2 │
+    │┌──────┬──────────────────┐ │
+    ││ │axis 1 │ │
+    │├──────┼──────────────────┤ │
+    ││axis 0│ 1.00 2.00 3.00 │ │
+    ││ │ 4.00 5.00 6.00 │ │
+    │└──────┴──────────────────┘ │
+    └────────────────────────────────────────────────────────────────────────┘
     |}];
   let%op hoo = [| [ 1; 2; 3 ]; [ 4; 5; 6 ] |] in
   Train.forward_and_forget backend ctx hoo;
   Tensor.print ~with_code:false ~with_grad:false `Inline @@ hoo;
-  [%expect {| [|[1.00; 2.00; 3.00]; [4.00; 5.00; 6.00]|] |}];
+  [%expect {|
+    [1]: [| [ 1.00 ; 2.00 ; 3.00 ] ; [ 4.00 ; 5.00 ; 6.00 ] |]_hoo shape 0:2|1:3 [|
+    [ 1.00 ; 2.00 ; 3.00 ]
+
; [ 4.00 ; 5.00 ; 6.00 ] + |] + |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ hoo; [%expect {| - ┌──────────────────────────────────────────────────────────────────┐ - │[1]: [|[1.00; 2.00; 3.00]; [4.00; 5.00; 6.00]|]_hoo shape 0:2|1:3 │ - │┌──────┬──────────────────┐ │ - ││ │axis 1 │ │ - │├──────┼──────────────────┤ │ - ││axis 0│ 1.00 2.00 3.00 │ │ - ││ │ 4.00 5.00 6.00 │ │ - │└──────┴──────────────────┘ │ - └──────────────────────────────────────────────────────────────────┘ + ┌─────────────────────────────────────────────────────────────────────────────┐ + │[1]: [| [ 1.00 ; 2.00 ; 3.00 ] ; [ 4.00 ; 5.00 ; 6.00 ] |]_hoo shape 0:2|1:3 │ + │┌──────┬──────────────────┐ │ + ││ │axis 1 │ │ + │├──────┼──────────────────┤ │ + ││axis 0│ 1.00 2.00 3.00 │ │ + ││ │ 4.00 5.00 6.00 │ │ + │└──────┴──────────────────┘ │ + └─────────────────────────────────────────────────────────────────────────────┘ |}]; let%op hey2 = [ @@ -145,10 +155,12 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ hey2; [%expect {| - [(1.00, 2.00, 3.00), (4.00, 5.00, 6.00); - (7.00, 8.00, 9.00), (10.00, 11.00, 12.00); - (13.00, 14.00, 15.00), (16.00, 17.00, 18.00); - (19.00, 20.00, 21.00), (22.00, 23.00, 24.00)] + [2]: c4x2x3_hey2 shape 1:2,2:3->0:4 [ + ( 1.00 , 2.00 , 3.00 ) , ( 4.00 , 5.00 , 6.00 ) + ; ( 7.00 , 8.00 , 9.00 ) , ( 10.00 , 11.00 , 12.00 ) + ; ( 13.00 , 14.00 , 15.00 ) , ( 16.00 , 17.00 , 18.00 ) + ; ( 19.00 , 20.00 , 21.00 ) , ( 22.00 , 23.00 , 24.00 ) + ] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ hey2; [%expect @@ -178,10 +190,12 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ hoo2; [%expect {| - [|[[1.00; 2.00; 3.00]; [4.00; 5.00; 6.00]]; - [[7.00; 8.00; 9.00]; [10.00; 11.00; 12.00]]; - [[13.00; 14.00; 15.00]; [16.00; 17.00; 18.00]]; - [[19.00; 20.00; 21.00]; [22.00; 23.00; 24.00]]|] + [3]: c4x2x3_hoo2 shape 0:4|1:2,2:3 [| + [ [ 1.00 ; 2.00 
; 3.00 ] ; [ 4.00 ; 5.00 ; 6.00 ] ] + ; [ [ 7.00 ; 8.00 ; 9.00 ] ; [ 10.00 ; 11.00 ; 12.00 ] ] + ; [ [ 13.00 ; 14.00 ; 15.00 ] ; [ 16.00 ; 17.00 ; 18.00 ] ] + ; [ [ 19.00 ; 20.00 ; 21.00 ] ; [ 22.00 ; 23.00 ; 24.00 ] ] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ hoo2; [%expect @@ -209,10 +223,12 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ heyhoo; [%expect {| - [|[|[1.00; 2.00; 3.00]; [4.00; 5.00; 6.00]|]; - [|[7.00; 8.00; 9.00]; [10.00; 11.00; 12.00]|]; - [|[13.00; 14.00; 15.00]; [16.00; 17.00; 18.00]|]; - [|[19.00; 20.00; 21.00]; [22.00; 23.00; 24.00]|]|] + [4]: c4x2x3_heyhoo shape 0:4,1:2|2:3 [| + [| [ 1.00 ; 2.00 ; 3.00 ] ; [ 4.00 ; 5.00 ; 6.00 ] |] + ; [| [ 7.00 ; 8.00 ; 9.00 ] ; [ 10.00 ; 11.00 ; 12.00 ] |] + ; [| [ 13.00 ; 14.00 ; 15.00 ] ; [ 16.00 ; 17.00 ; 18.00 ] |] + ; [| [ 19.00 ; 20.00 ; 21.00 ] ; [ 22.00 ; 23.00 ; 24.00 ] |] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ heyhoo; [%expect @@ -240,15 +256,24 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ heyhoo2; [%expect {| - [| - [|[[1.00; 31.00]; [2.00; 32.00]; [3.00; 33.00]]; - [[4.00; 34.00]; [5.00; 35.00]; [6.00; 36.00]]|]; - [|[[7.00; 37.00]; [8.00; 38.00]; [9.00; 39.00]]; - [[10.00; 40.00]; [11.00; 41.00]; [12.00; 42.00]]|]; - [|[[13.00; 43.00]; [14.00; 44.00]; [15.00; 45.00]]; - [[16.00; 46.00]; [17.00; 47.00]; [18.00; 48.00]]|]; - [|[[19.00; 49.00]; [20.00; 50.00]; [21.00; 51.00]]; - [[22.00; 52.00]; [23.00; 53.00]; [24.00; 54.00]]|]|] + [5]: c4x2x3x2_heyhoo2 shape 0:4,1:2|2:3,3:2 [| + [| + [ [ 1.00 ; 31.00 ] ; [ 2.00 ; 32.00 ] ; [ 3.00 ; 33.00 ] ] + ; [ [ 4.00 ; 34.00 ] ; [ 5.00 ; 35.00 ] ; [ 6.00 ; 36.00 ] ] + |] + ; [| + [ [ 7.00 ; 37.00 ] ; [ 8.00 ; 38.00 ] ; [ 9.00 ; 39.00 ] ] + ; [ [ 10.00 ; 40.00 ] ; [ 11.00 ; 41.00 ] ; [ 12.00 ; 42.00 ] ] + |] + ; [| + [ [ 13.00 ; 43.00 ] ; [ 14.00 ; 44.00 ] ; [ 15.00 ; 45.00 ] ] + ; [ [ 16.00 
; 46.00 ] ; [ 17.00 ; 47.00 ] ; [ 18.00 ; 48.00 ] ] + |] + ; [| + [ [ 19.00 ; 49.00 ] ; [ 20.00 ; 50.00 ] ; [ 21.00 ; 51.00 ] ] + ; [ [ 22.00 ; 52.00 ] ; [ 23.00 ; 53.00 ] ; [ 24.00 ; 54.00 ] ] + |] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ heyhoo2; [%expect @@ -293,17 +318,28 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ heyhoo3; [%expect {| - [| + [6]: c2x2x2x3x2_heyhoo3 shape 0:2,1:2|2:2,3:3,4:2 [| [| - [[[1.00; 31.00]; [2.00; 32.00]; [3.00; 33.00]]; - [[4.00; 34.00]; [5.00; 35.00]; [6.00; 36.00]]]; - [[[7.00; 37.00]; [8.00; 38.00]; [9.00; 39.00]]; - [[10.00; 40.00]; [11.00; 41.00]; [12.00; 42.00]]]|]; - [| - [[[13.00; 43.00]; [14.00; 44.00]; [15.00; 45.00]]; - [[16.00; 46.00]; [17.00; 47.00]; [18.00; 48.00]]]; - [[[19.00; 49.00]; [20.00; 50.00]; [21.00; 51.00]]; - [[22.00; 52.00]; [23.00; 53.00]; [24.00; 54.00]]]|]|] + [ + [ [ 1.00 ; 31.00 ] ; [ 2.00 ; 32.00 ] ; [ 3.00 ; 33.00 ] ] + ; [ [ 4.00 ; 34.00 ] ; [ 5.00 ; 35.00 ] ; [ 6.00 ; 36.00 ] ] + ] + ; [ + [ [ 7.00 ; 37.00 ] ; [ 8.00 ; 38.00 ] ; [ 9.00 ; 39.00 ] ] + ; [ [ 10.00 ; 40.00 ] ; [ 11.00 ; 41.00 ] ; [ 12.00 ; 42.00 ] ] + ] + |] + ; [| + [ + [ [ 13.00 ; 43.00 ] ; [ 14.00 ; 44.00 ] ; [ 15.00 ; 45.00 ] ] + ; [ [ 16.00 ; 46.00 ] ; [ 17.00 ; 47.00 ] ; [ 18.00 ; 48.00 ] ] + ] + ; [ + [ [ 19.00 ; 49.00 ] ; [ 20.00 ; 50.00 ] ; [ 21.00 ; 51.00 ] ] + ; [ [ 22.00 ; 52.00 ] ; [ 23.00 ; 53.00 ] ; [ 24.00 ; 54.00 ] ] + ] + |] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ heyhoo3; [%expect @@ -353,17 +389,28 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ heyhoo4; [%expect {| - [| - [ - [[1.00, 31.00; 2.00, 32.00; 3.00, 33.00]; - [4.00, 34.00; 5.00, 35.00; 6.00, 36.00]]; - [[7.00, 37.00; 8.00, 38.00; 9.00, 39.00]; - [10.00, 40.00; 11.00, 41.00; 12.00, 42.00]]]; + [7]: c2x2x2x3x2_heyhoo4 shape 0:2|4:2->1:2,2:2,3:3 [| [ - [[13.00, 43.00; 14.00, 44.00; 15.00, 
45.00]; - [16.00, 46.00; 17.00, 47.00; 18.00, 48.00]]; - [[19.00, 49.00; 20.00, 50.00; 21.00, 51.00]; - [22.00, 52.00; 23.00, 53.00; 24.00, 54.00]]]|] + [ + [ 1.00 , 31.00 ; 2.00 , 32.00 ; 3.00 , 33.00 ] + ; [ 4.00 , 34.00 ; 5.00 , 35.00 ; 6.00 , 36.00 ] + ] + ; [ + [ 7.00 , 37.00 ; 8.00 , 38.00 ; 9.00 , 39.00 ] + ; [ 10.00 , 40.00 ; 11.00 , 41.00 ; 12.00 , 42.00 ] + ] + ] + ; [ + [ + [ 13.00 , 43.00 ; 14.00 , 44.00 ; 15.00 , 45.00 ] + ; [ 16.00 , 46.00 ; 17.00 , 47.00 ; 18.00 , 48.00 ] + ] + ; [ + [ 19.00 , 49.00 ; 20.00 , 50.00 ; 21.00 , 51.00 ] + ; [ 22.00 , 52.00 ; 23.00 , 53.00 ; 24.00 , 54.00 ] + ] + ] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ heyhoo4; [%expect @@ -462,8 +509,29 @@ let%expect_test "Big matrix" = Tensor.print ~with_code:false ~with_grad:false `Inline zero_to_twenty; [%expect {| - [0.00; 1.00; 2.00; 3.00; 4.00; 5.00; 6.00; 7.00; 8.00; 9.00; 10.00; 11.00; - 12.00; 13.00; 14.00; 15.00; 16.00; 17.00; 18.00; 19.00; 20.00] + [2]: 0...20 shape 0:21 [ + 0.00 + ; 1.00 + ; 2.00 + ; 3.00 + ; 4.00 + ; 5.00 + ; 6.00 + ; 7.00 + ; 8.00 + ; 9.00 + ; 10.00 + ; 11.00 + ; 12.00 + ; 13.00 + ; 14.00 + ; 15.00 + ; 16.00 + ; 17.00 + ; 18.00 + ; 19.00 + ; 20.00 + ] |}]; Tensor.print ~with_code:false ~with_grad:false `Default zero_to_twenty; [%expect (cd _build/default/test && ./moons_demo_parallel_run.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test/ocannl_config. 
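The diff blocks above are ppx_expect inline-test failures: the runner compares the program's actual printed output against each [%expect] block and, on a mismatch, writes a .corrected file, which the build then shows via git diff (here, the tensor printer's new spaced formatting versus the old expectations). A minimal model of that compare-and-correct cycle (my illustration in Python, not the actual dune/ppx_expect machinery):

```python
# Hypothetical model of an expect test: on mismatch, return the contents
# that a ".corrected" file would receive, so they can be promoted.
def run_expect_test(expected, actual):
    if actual == expected:
        return None      # expectation holds; no .corrected file written
    return actual        # mismatch; this becomes the .corrected contents

# Simplified version of the first hello_world_op.ml hunk: the old inline
# expectation versus the printer's new spaced formatting.
old = "[1.00, 2.00, 3.00; 4.00, 5.00, 6.00]"
new = "[ 1.00 , 2.00 , 3.00 ; 4.00 , 5.00 , 6.00 ]"
print("needs promotion" if run_expect_test(old, new) else "up to date")
```

In a dune project these corrections would typically be accepted by rerunning the tests and promoting the .corrected files back into the sources.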
Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file ("Set log_level to" 1) └─{orphaned from #2} Retrieving commandline, environment, or config file variable ocannl_backend Found multicore_cc, in the config file Properties of devices: (multicore_devices (device ((device_name CPU) (device_ordinal 0) (num_domains 72)))) @!Retrieving commandline, environment, or config file variable ocannl_prefer_backend_uniformity Found true, in the config file Retrieving commandline, environment, or config file variable ocannl_debug_log_to_stream_files Not found, using default false Retrieving commandline, environment, or config file variable ocannl_ll_ident_style Not found, using default heuristic Retrieving commandline, environment, or config file variable ocannl_cc_backend_optimization_level Not found, using default 3 Retrieving commandline, environment, or config file variable ocannl_cc_backend_compiler_command Not found, using default gcc Retrieving commandline, environment, or config file variable ocannl_never_capture_stdout Not found, using default false Batch=59, step=60, lr=0.200000, batch loss=23.609453, epoch loss=23.609453 Batch=119, step=120, lr=0.199750, batch loss=8.539634, epoch loss=32.149087 Batch=179, step=180, lr=0.199250, batch loss=2.626295, epoch loss=34.775382 Batch=239, step=240, lr=0.199250, batch loss=0.851101, epoch loss=35.626483 Batch=299, step=300, lr=0.199000, batch loss=1.445322, epoch loss=37.071804 Batch=359, step=360, lr=0.198750, batch loss=1.330969, epoch loss=38.402774 Batch=419, step=420, lr=0.198500, batch loss=0.617836, epoch loss=39.020610 Batch=479, step=480, lr=0.198250, batch loss=0.823501, epoch loss=39.844111 Batch=539, step=540, lr=0.197750, batch loss=0.688277, epoch loss=40.532388 Batch=599, step=600, lr=0.197750, batch loss=1.064498, epoch loss=41.596886 Batch=659, step=660, lr=0.197250, batch loss=0.483272, epoch loss=42.080158 Batch=719, step=720, lr=0.197000, batch 
loss=0.411335, epoch loss=42.491494 Batch=779, step=780, lr=0.196750, batch loss=0.468737, epoch loss=42.960231 Batch=839, step=840, lr=0.196500, batch loss=0.443418, epoch loss=43.403648 Batch=899, step=900, lr=0.196250, batch loss=0.383531, epoch loss=43.787180 Batch=959, step=960, lr=0.196250, batch loss=0.243070, epoch loss=44.030249 Batch=1019, step=1020, lr=0.196000, batch loss=0.454756, epoch loss=44.485006 Batch=1079, step=1080, lr=0.195500, batch loss=0.254331, epoch loss=44.739336 Batch=1139, step=1140, lr=0.195250, batch loss=0.335699, epoch loss=45.075036 Batch=1199, step=1200, lr=0.195000, batch loss=0.262340, epoch loss=45.337376 Epoch=0, step=1200, lr=0.195000, epoch loss=45.337376 Batch=59, step=1260, lr=0.194750, batch loss=0.261167, epoch loss=0.261167 Batch=119, step=1320, lr=0.194750, batch loss=0.206791, epoch loss=0.467958 Batch=179, step=1380, lr=0.194250, batch loss=0.246753, epoch loss=0.714711 Batch=239, step=1440, lr=0.194000, batch loss=0.352118, epoch loss=1.066828 Batch=299, step=1500, lr=0.193750, batch loss=0.235811, epoch loss=1.302639 Batch=359, step=1560, lr=0.193750, batch loss=0.312474, epoch loss=1.615113 Batch=419, step=1620, lr=0.193500, batch loss=0.309053, epoch loss=1.924166 Batch=479, step=1680, lr=0.193000, batch loss=0.273529, epoch loss=2.197695 Batch=539, step=1740, lr=0.192750, batch loss=0.210928, epoch loss=2.408623 Batch=599, step=1800, lr=0.192500, batch loss=0.250511, epoch loss=2.659134 Batch=659, step=1860, lr=0.192500, batch loss=0.369600, epoch loss=3.028734 Batch=719, step=1920, lr=0.192250, batch loss=0.365466, epoch loss=3.394200 Batch=779, step=1980, lr=0.192000, batch loss=0.384015, epoch loss=3.778214 Batch=839, step=2040, lr=0.191750, batch loss=0.341834, epoch loss=4.120048 Batch=899, step=2100, lr=0.191500, batch loss=0.302638, epoch loss=4.422686 Batch=959, step=2160, lr=0.191000, batch loss=0.234229, epoch loss=4.656914 Batch=1019, step=2220, lr=0.190750, batch loss=0.355117, epoch loss=5.012031 
Batch=1079, step=2280, lr=0.190500, batch loss=0.237143, epoch loss=5.249174 Batch=1139, step=2340, lr=0.190250, batch loss=0.281757, epoch loss=5.530931 Batch=1199, step=2400, lr=0.190000, batch loss=0.218572, epoch loss=5.749503 Epoch=1, step=2400, lr=0.190000, epoch loss=5.749503 Batch=59, step=2460, lr=0.189750, batch loss=0.230448, epoch loss=0.230448 Batch=119, step=2520, lr=0.189500, batch loss=0.191841, epoch loss=0.422289 Batch=179, step=2580, lr=0.189500, batch loss=0.221777, epoch loss=0.644067 Batch=239, step=2640, lr=0.189000, batch loss=0.336560, epoch loss=0.980627 Batch=299, step=2700, lr=0.189000, batch loss=0.212436, epoch loss=1.193062 Batch=359, step=2760, lr=0.188500, batch loss=0.295806, epoch loss=1.488869 Batch=419, step=2820, lr=0.188500, batch loss=0.283080, epoch loss=1.771949 Batch=479, step=2880, lr=0.188250, batch loss=0.261761, epoch loss=2.033710 Batch=539, step=2940, lr=0.188000, batch loss=0.195581, epoch loss=2.229291 Batch=599, step=3000, lr=0.187500, batch loss=0.233809, epoch loss=2.463100 Batch=659, step=3060, lr=0.187250, batch loss=0.340353, epoch loss=2.803453 Batch=719, step=3120, lr=0.187000, batch loss=0.337393, epoch loss=3.140846 Batch=779, step=3180, lr=0.186750, batch loss=0.358425, epoch loss=3.499271 Batch=839, step=3240, lr=0.186500, batch loss=0.329634, epoch loss=3.828904 Batch=899, step=3300, lr=0.186250, batch loss=0.298396, epoch loss=4.127300 Batch=959, step=3360, lr=0.186000, batch loss=0.236567, epoch loss=4.363867 Batch=1019, step=3420, lr=0.185750, batch loss=0.351073, epoch loss=4.714940 Batch=1079, step=3480, lr=0.185750, batch loss=0.221353, epoch loss=4.936293 Batch=1139, step=3540, lr=0.185500, batch loss=0.262828, epoch loss=5.199121 Batch=1199, step=3600, lr=0.185250, batch loss=0.199131, epoch loss=5.398252 Epoch=2, step=3600, lr=0.185250, epoch loss=5.398252 Batch=59, step=3660, lr=0.185000, batch loss=0.226600, epoch loss=0.226600 Batch=119, step=3720, lr=0.184750, batch loss=0.186116, epoch 
loss=0.412716 Batch=179, step=3780, lr=0.184250, batch loss=0.213243, epoch loss=0.625959 Batch=239, step=3840, lr=0.184000, batch loss=0.318006, epoch loss=0.943965 Batch=299, step=3900, lr=0.183750, batch loss=0.210296, epoch loss=1.154261 Batch=359, step=3960, lr=0.183750, batch loss=0.293948, epoch loss=1.448209 Batch=419, step=4020, lr=0.183500, batch loss=0.304253, epoch loss=1.752462 Batch=479, step=4080, lr=0.183250, batch loss=0.255681, epoch loss=2.008142 Batch=539, step=4140, lr=0.183000, batch loss=0.193839, epoch loss=2.201982 Batch=599, step=4200, lr=0.182750, batch loss=0.232851, epoch loss=2.434832 Batch=659, step=4260, lr=0.182250, batch loss=0.335695, epoch loss=2.770528 Batch=719, step=4320, lr=0.182250, batch loss=0.338684, epoch loss=3.109212 Batch=779, step=4380, lr=0.182000, batch loss=0.352045, epoch loss=3.461257 Batch=839, step=4440, lr=0.181500, batch loss=0.315028, epoch loss=3.776285 Batch=899, step=4500, lr=0.181500, batch loss=0.282990, epoch loss=4.059275 Batch=959, step=4560, lr=0.181250, batch loss=0.207255, epoch loss=4.266530 Batch=1019, step=4620, lr=0.181000, batch loss=0.300681, epoch loss=4.567211 Batch=1079, step=4680, lr=0.180750, batch loss=0.180070, epoch loss=4.747281 Batch=1139, step=4740, lr=0.180500, batch loss=0.224603, epoch loss=4.971884 Batch=1199, step=4800, lr=0.180250, batch loss=0.195766, epoch loss=5.167650 Epoch=3, step=4800, lr=0.180250, epoch loss=5.167650 Batch=59, step=4860, lr=0.179750, batch loss=0.232561, epoch loss=0.232561 Batch=119, step=4920, lr=0.179750, batch loss=0.194035, epoch loss=0.426596 Batch=179, step=4980, lr=0.179250, batch loss=0.205246, epoch loss=0.631842 Batch=239, step=5040, lr=0.179000, batch loss=0.307721, epoch loss=0.939562 Batch=299, step=5100, lr=0.178750, batch loss=0.205342, epoch loss=1.144904 Batch=359, step=5160, lr=0.178500, batch loss=0.270999, epoch loss=1.415903 Batch=419, step=5220, lr=0.178250, batch loss=0.265085, epoch loss=1.680988 Batch=479, step=5280, 
lr=0.178250, batch loss=0.240815, epoch loss=1.921803 Batch=539, step=5340, lr=0.177750, batch loss=0.189976, epoch loss=2.111779 Batch=599, step=5400, lr=0.177500, batch loss=0.233419, epoch loss=2.345198 Batch=659, step=5460, lr=0.177500, batch loss=0.322391, epoch loss=2.667589 Batch=719, step=5520, lr=0.177250, batch loss=0.324696, epoch loss=2.992286 Batch=779, step=5580, lr=0.177000, batch loss=0.337393, epoch loss=3.329679 Batch=839, step=5640, lr=0.176750, batch loss=0.307710, epoch loss=3.637389 Batch=899, step=5700, lr=0.176250, batch loss=0.272477, epoch loss=3.909865 Batch=959, step=5760, lr=0.176000, batch loss=0.212715, epoch loss=4.122580 Batch=1019, step=5820, lr=0.176000, batch loss=0.340636, epoch loss=4.463216 Batch=1079, step=5880, lr=0.175750, batch loss=0.203418, epoch loss=4.666634 Batch=1139, step=5940, lr=0.175250, batch loss=0.240425, epoch loss=4.907059 Batch=1199, step=6000, lr=0.175000, batch loss=0.188105, epoch loss=5.095164 Epoch=4, step=6000, lr=0.175000, epoch loss=5.095164 Batch=59, step=6060, lr=0.174750, batch loss=0.237156, epoch loss=0.237156 Batch=119, step=6120, lr=0.174500, batch loss=0.188429, epoch loss=0.425585 Batch=179, step=6180, lr=0.174250, batch loss=0.202146, epoch loss=0.627731 Batch=239, step=6240, lr=0.174000, batch loss=0.301324, epoch loss=0.929055 Batch=299, step=6300, lr=0.173750, batch loss=0.204116, epoch loss=1.133171 Batch=359, step=6360, lr=0.173750, batch loss=0.268377, epoch loss=1.401548 Batch=419, step=6420, lr=0.173250, batch loss=0.266146, epoch loss=1.667694 Batch=479, step=6480, lr=0.173000, batch loss=0.242688, epoch loss=1.910382 Batch=539, step=6540, lr=0.173000, batch loss=0.193740, epoch loss=2.104122 Batch=599, step=6600, lr=0.172750, batch loss=0.227763, epoch loss=2.331884 Batch=659, step=6660, lr=0.172250, batch loss=0.315084, epoch loss=2.646969 Batch=719, step=6720, lr=0.172000, batch loss=0.317031, epoch loss=2.964000 Batch=779, step=6780, lr=0.171750, batch loss=0.332391, epoch 
loss=3.296391 Batch=839, step=6840, lr=0.171750, batch loss=0.306196, epoch loss=3.602587 Batch=899, step=6900, lr=0.171500, batch loss=0.269982, epoch loss=3.872569 Batch=959, step=6960, lr=0.171250, batch loss=0.206637, epoch loss=4.079206 Batch=1019, step=7020, lr=0.171000, batch loss=0.333202, epoch loss=4.412408 Batch=1079, step=7080, lr=0.170750, batch loss=0.191105, epoch loss=4.603513 Batch=1139, step=7140, lr=0.170250, batch loss=0.226878, epoch loss=4.830391 Batch=1199, step=7200, lr=0.170000, batch loss=0.179737, epoch loss=5.010129 Epoch=5, step=7200, lr=0.170000, epoch loss=5.010129 Batch=59, step=7260, lr=0.169750, batch loss=0.218276, epoch loss=0.218276 Batch=119, step=7320, lr=0.169500, batch loss=0.179602, epoch loss=0.397878 Batch=179, step=7380, lr=0.169250, batch loss=0.195977, epoch loss=0.593854 Batch=239, step=7440, lr=0.169250, batch loss=0.293474, epoch loss=0.887329 Batch=299, step=7500, lr=0.168750, batch loss=0.206164, epoch loss=1.093492 Batch=359, step=7560, lr=0.168750, batch loss=0.261471, epoch loss=1.354963 Batch=419, step=7620, lr=0.168500, batch loss=0.253465, epoch loss=1.608428 Batch=479, step=7680, lr=0.168250, batch loss=0.232198, epoch loss=1.840627 Batch=539, step=7740, lr=0.168000, batch loss=0.184871, epoch loss=2.025498 Batch=599, step=7800, lr=0.167750, batch loss=0.218870, epoch loss=2.244367 Batch=659, step=7860, lr=0.167250, batch loss=0.306364, epoch loss=2.550732 Batch=719, step=7920, lr=0.167250, batch loss=0.308805, epoch loss=2.859537 Batch=779, step=7980, lr=0.167000, batch loss=0.329771, epoch loss=3.189308 Batch=839, step=8040, lr=0.166750, batch loss=0.291243, epoch loss=3.480551 Batch=899, step=8100, lr=0.166250, batch loss=0.261242, epoch loss=3.741793 Batch=959, step=8160, lr=0.166250, batch loss=0.195765, epoch loss=3.937558 Batch=1019, step=8220, lr=0.165750, batch loss=0.324081, epoch loss=4.261640 Batch=1079, step=8280, lr=0.165500, batch loss=0.193561, epoch loss=4.455201 Batch=1139, step=8340, 
lr=0.165250, batch loss=0.222027, epoch loss=4.677228 Batch=1199, step=8400, lr=0.165250, batch loss=0.175916, epoch loss=4.853144 Epoch=6, step=8400, lr=0.165250, epoch loss=4.853144 Batch=59, step=8460, lr=0.164750, batch loss=0.215569, epoch loss=0.215569 Batch=119, step=8520, lr=0.164750, batch loss=0.183150, epoch loss=0.398719 Batch=179, step=8580, lr=0.164250, batch loss=0.188826, epoch loss=0.587544 Batch=239, step=8640, lr=0.164250, batch loss=0.279519, epoch loss=0.867064 Batch=299, step=8700, lr=0.163750, batch loss=0.190263, epoch loss=1.057327 Batch=359, step=8760, lr=0.163750, batch loss=0.247016, epoch loss=1.304343 Batch=419, step=8820, lr=0.163500, batch loss=0.240949, epoch loss=1.545292 Batch=479, step=8880, lr=0.163250, batch loss=0.220240, epoch loss=1.765532 Batch=539, step=8940, lr=0.163000, batch loss=0.173801, epoch loss=1.939332 Batch=599, step=9000, lr=0.162500, batch loss=0.208618, epoch loss=2.147950 Batch=659, step=9060, lr=0.162250, batch loss=0.296579, epoch loss=2.444529 Batch=719, step=9120, lr=0.162250, batch loss=0.295604, epoch loss=2.740132 Batch=779, step=9180, lr=0.161750, batch loss=0.317620, epoch loss=3.057752 Batch=839, step=9240, lr=0.161750, batch loss=0.286981, epoch loss=3.344733 Batch=899, step=9300, lr=0.161250, batch loss=0.253315, epoch loss=3.598048 Batch=959, step=9360, lr=0.161250, batch loss=0.187245, epoch loss=3.785293 Batch=1019, step=9420, lr=0.161000, batch loss=0.302056, epoch loss=4.087349 Batch=1079, step=9480, lr=0.160750, batch loss=0.167141, epoch loss=4.254489 Batch=1139, step=9540, lr=0.160500, batch loss=0.199865, epoch loss=4.454355 Batch=1199, step=9600, lr=0.160250, batch loss=0.166063, epoch loss=4.620418 Epoch=7, step=9600, lr=0.160250, epoch loss=4.620418 Batch=59, step=9660, lr=0.160000, batch loss=0.189449, epoch loss=0.189449 Batch=119, step=9720, lr=0.159750, batch loss=0.159193, epoch loss=0.348643 Batch=179, step=9780, lr=0.159500, batch loss=0.178123, epoch loss=0.526766 Batch=239, 
step=9840, lr=0.159250, batch loss=0.262567, epoch loss=0.789333 Batch=299, step=9900, lr=0.159000, batch loss=0.184476, epoch loss=0.973809 Batch=359, step=9960, lr=0.158750, batch loss=0.235417, epoch loss=1.209226 Batch=419, step=10020, lr=0.158500, batch loss=0.232149, epoch loss=1.441375 Batch=479, step=10080, lr=0.158250, batch loss=0.214241, epoch loss=1.655616 Batch=539, step=10140, lr=0.158000, batch loss=0.172553, epoch loss=1.828169 Batch=599, step=10200, lr=0.157750, batch loss=0.202391, epoch loss=2.030561 Batch=659, step=10260, lr=0.157500, batch loss=0.282357, epoch loss=2.312918 Batch=719, step=10320, lr=0.157250, batch loss=0.286929, epoch loss=2.599846 Batch=779, step=10380, lr=0.157000, batch loss=0.296570, epoch loss=2.896416 Batch=839, step=10440, lr=0.156750, batch loss=0.267503, epoch loss=3.163919 Batch=899, step=10500, lr=0.156500, batch loss=0.238627, epoch loss=3.402546 Batch=959, step=10560, lr=0.156250, batch loss=0.178163, epoch loss=3.580709 Batch=1019, step=10620, lr=0.156000, batch loss=0.299383, epoch loss=3.880092 Batch=1079, step=10680, lr=0.155750, batch loss=0.176293, epoch loss=4.056385 Batch=1139, step=10740, lr=0.155500, batch loss=0.200751, epoch loss=4.257136 Batch=1199, step=10800, lr=0.155250, batch loss=0.156712, epoch loss=4.413848 Epoch=8, step=10800, lr=0.155250, epoch loss=4.413848 Batch=59, step=10860, lr=0.155000, batch loss=0.179090, epoch loss=0.179090 Batch=119, step=10920, lr=0.154500, batch loss=0.153597, epoch loss=0.332687 Batch=179, step=10980, lr=0.154250, batch loss=0.168265, epoch loss=0.500952 Batch=239, step=11040, lr=0.154000, batch loss=0.248162, epoch loss=0.749114 Batch=299, step=11100, lr=0.153750, batch loss=0.161601, epoch loss=0.910715 Batch=359, step=11160, lr=0.153500, batch loss=0.227415, epoch loss=1.138131 Batch=419, step=11220, lr=0.153250, batch loss=0.228030, epoch loss=1.366160 Batch=479, step=11280, lr=0.153250, batch loss=0.205364, epoch loss=1.571524 Batch=539, step=11340, 
lr=0.153000, batch loss=0.158485, epoch loss=1.730010 Batch=599, step=11400, lr=0.152500, batch loss=0.181766, epoch loss=1.911775 Batch=659, step=11460, lr=0.152500, batch loss=0.264378, epoch loss=2.176153 Batch=719, step=11520, lr=0.152000, batch loss=0.264826, epoch loss=2.440979 Batch=779, step=11580, lr=0.152000, batch loss=0.273047, epoch loss=2.714027 Batch=839, step=11640, lr=0.151500, batch loss=0.252935, epoch loss=2.966962 Batch=899, step=11700, lr=0.151250, batch loss=0.220271, epoch loss=3.187233 Batch=959, step=11760, lr=0.151250, batch loss=0.176182, epoch loss=3.363415 Batch=1019, step=11820, lr=0.151000, batch loss=0.259124, epoch loss=3.622539 Batch=1079, step=11880, lr=0.150500, batch loss=0.139161, epoch loss=3.761700 Batch=1139, step=11940, lr=0.150500, batch loss=0.167406, epoch loss=3.929106 Batch=1199, step=12000, lr=0.150000, batch loss=0.142187, epoch loss=4.071292 Epoch=9, step=12000, lr=0.150000, epoch loss=4.071292 Batch=59, step=12060, lr=0.149750, batch loss=0.181346, epoch loss=0.181346 Batch=119, step=12120, lr=0.149750, batch loss=0.147183, epoch loss=0.328530 Batch=179, step=12180, lr=0.149500, batch loss=0.152156, epoch loss=0.480686 Batch=239, step=12240, lr=0.149000, batch loss=0.221241, epoch loss=0.701926 Batch=299, step=12300, lr=0.148750, batch loss=0.144115, epoch loss=0.846041 Batch=359, step=12360, lr=0.148500, batch loss=0.197576, epoch loss=1.043617 Batch=419, step=12420, lr=0.148500, batch loss=0.209398, epoch loss=1.253015 Batch=479, step=12480, lr=0.148000, batch loss=0.180879, epoch loss=1.433894 Batch=539, step=12540, lr=0.148000, batch loss=0.145172, epoch loss=1.579066 Batch=599, step=12600, lr=0.147750, batch loss=0.151544, epoch loss=1.730610 Batch=659, step=12660, lr=0.147250, batch loss=0.228792, epoch loss=1.959402 Batch=719, step=12720, lr=0.147000, batch loss=0.235669, epoch loss=2.195071 Batch=779, step=12780, lr=0.146750, batch loss=0.258783, epoch loss=2.453854 Batch=839, step=12840, lr=0.146750, 
batch loss=0.226612, epoch loss=2.680466 Batch=899, step=12900, lr=0.146250, batch loss=0.191391, epoch loss=2.871857 Batch=959, step=12960, lr=0.146000, batch loss=0.164515, epoch loss=3.036372 Batch=1019, step=13020, lr=0.146000, batch loss=0.262312, epoch loss=3.298684 Batch=1079, step=13080, lr=0.145500, batch loss=0.120090, epoch loss=3.418774 Batch=1139, step=13140, lr=0.145500, batch loss=0.161262, epoch loss=3.580037 Batch=1199, step=13200, lr=0.145250, batch loss=0.121188, epoch loss=3.701225 Epoch=10, step=13200, lr=0.145250, epoch loss=3.701225 Batch=59, step=13260, lr=0.145000, batch loss=0.145089, epoch loss=0.145089 Batch=119, step=13320, lr=0.144500, batch loss=0.121761, epoch loss=0.266850 Batch=179, step=13380, lr=0.144500, batch loss=0.130705, epoch loss=0.397556 Batch=239, step=13440, lr=0.144000, batch loss=0.191752, epoch loss=0.589307 Batch=299, step=13500, lr=0.144000, batch loss=0.116251, epoch loss=0.705558 Batch=359, step=13560, lr=0.143750, batch loss=0.164485, epoch loss=0.870043 Batch=419, step=13620, lr=0.143500, batch loss=0.164330, epoch loss=1.034373 Batch=479, step=13680, lr=0.143250, batch loss=0.150351, epoch loss=1.184725 Batch=539, step=13740, lr=0.143000, batch loss=0.121434, epoch loss=1.306158 Batch=599, step=13800, lr=0.142750, batch loss=0.123433, epoch loss=1.429592 Batch=659, step=13860, lr=0.142500, batch loss=0.179600, epoch loss=1.609192 Batch=719, step=13920, lr=0.142000, batch loss=0.181847, epoch loss=1.791039 Batch=779, step=13980, lr=0.142000, batch loss=0.201279, epoch loss=1.992318 Batch=839, step=14040, lr=0.141750, batch loss=0.193412, epoch loss=2.185730 Batch=899, step=14100, lr=0.141500, batch loss=0.179192, epoch loss=2.364922 Batch=959, step=14160, lr=0.141250, batch loss=0.144275, epoch loss=2.509197 Batch=1019, step=14220, lr=0.141000, batch loss=0.295512, epoch loss=2.804709 Batch=1079, step=14280, lr=0.140750, batch loss=0.081281, epoch loss=2.885991 Batch=1139, step=14340, lr=0.140500, batch 
loss=0.125730, epoch loss=3.011721 Batch=1199, step=14400, lr=0.140250, batch loss=0.094595, epoch loss=3.106316 Epoch=11, step=14400, lr=0.140250, epoch loss=3.106316 Batch=59, step=14460, lr=0.140000, batch loss=0.114364, epoch loss=0.114364 Batch=119, step=14520, lr=0.139750, batch loss=0.106035, epoch loss=0.220399 Batch=179, step=14580, lr=0.139500, batch loss=0.106529, epoch loss=0.326928 Batch=239, step=14640, lr=0.139250, batch loss=0.147021, epoch loss=0.473948 Batch=299, step=14700, lr=0.139000, batch loss=0.083475, epoch loss=0.557423 Batch=359, step=14760, lr=0.138750, batch loss=0.122420, epoch loss=0.679843 Batch=419, step=14820, lr=0.138500, batch loss=0.127339, epoch loss=0.807182 Batch=479, step=14880, lr=0.138250, batch loss=0.111404, epoch loss=0.918586 Batch=539, step=14940, lr=0.138000, batch loss=0.104232, epoch loss=1.022818 Batch=599, step=15000, lr=0.137750, batch loss=0.088458, epoch loss=1.111276 Batch=659, step=15060, lr=0.137500, batch loss=0.132440, epoch loss=1.243716 Batch=719, step=15120, lr=0.137000, batch loss=0.129253, epoch loss=1.372969 Batch=779, step=15180, lr=0.136750, batch loss=0.131070, epoch loss=1.504039 Batch=839, step=15240, lr=0.136500, batch loss=0.156766, epoch loss=1.660805 Batch=899, step=15300, lr=0.136500, batch loss=0.238827, epoch loss=1.899632 Batch=959, step=15360, lr=0.136250, batch loss=0.053492, epoch loss=1.953124 Batch=1019, step=15420, lr=0.136000, batch loss=0.137327, epoch loss=2.090451 Batch=1079, step=15480, lr=0.135750, batch loss=0.056890, epoch loss=2.147341 Batch=1139, step=15540, lr=0.135500, batch loss=0.115882, epoch loss=2.263223 Batch=1199, step=15600, lr=0.135000, batch loss=0.059383, epoch loss=2.322606 Epoch=12, step=15600, lr=0.135000, epoch loss=2.322606 Batch=59, step=15660, lr=0.134750, batch loss=0.099582, epoch loss=0.099582 Batch=119, step=15720, lr=0.134500, batch loss=0.164334, epoch loss=0.263916 Batch=179, step=15780, lr=0.134500, batch loss=0.113456, epoch loss=0.377372 
Batch=239, step=15840, lr=0.134250, batch loss=0.107506, epoch loss=0.484878
Batch=299, step=15900, lr=0.133750, batch loss=0.050915, epoch loss=0.535792
Batch=359, step=15960, lr=0.133500, batch loss=0.089415, epoch loss=0.625208
Batch=419, step=16020, lr=0.133500, batch loss=0.090135, epoch loss=0.715342
Batch=479, step=16080, lr=0.133250, batch loss=0.071018, epoch loss=0.786360
Batch=539, step=16140, lr=0.133000, batch loss=0.052198, epoch loss=0.838558
Batch=599, step=16200, lr=0.132750, batch loss=0.098367, epoch loss=0.936924
Batch=659, step=16260, lr=0.132500, batch loss=0.081039, epoch loss=1.017963
Batch=719, step=16320, lr=0.132250, batch loss=0.080277, epoch loss=1.098240
Batch=779, step=16380, lr=0.132000, batch loss=0.092177, epoch loss=1.190418
Batch=839, step=16440, lr=0.131750, batch loss=0.117892, epoch loss=1.308309
Batch=899, step=16500, lr=0.131500, batch loss=0.150586, epoch loss=1.458896
Batch=959, step=16560, lr=0.131250, batch loss=0.056878, epoch loss=1.515774
Batch=1019, step=16620, lr=0.131000, batch loss=0.150118, epoch loss=1.665892
Batch=1079, step=16680, lr=0.130750, batch loss=0.021928, epoch loss=1.687820
Batch=1139, step=16740, lr=0.130250, batch loss=0.044965, epoch loss=1.732784
Batch=1199, step=16800, lr=0.130250, batch loss=0.027503, epoch loss=1.760287
Epoch=13, step=16800, lr=0.130250, epoch loss=1.760287
Batch=59, step=16860, lr=0.130000, batch loss=0.051090, epoch loss=0.051090
Batch=119, step=16920, lr=0.129750, batch loss=0.101051, epoch loss=0.152141
Batch=179, step=16980, lr=0.129500, batch loss=0.069363, epoch loss=0.221504
Batch=239, step=17040, lr=0.129250, batch loss=0.062909, epoch loss=0.284412
Batch=299, step=17100, lr=0.129000, batch loss=0.019750, epoch loss=0.304162
Batch=359, step=17160, lr=0.128750, batch loss=0.048892, epoch loss=0.353054
Batch=419, step=17220, lr=0.128500, batch loss=0.053599, epoch loss=0.406653
Batch=479, step=17280, lr=0.128250, batch loss=0.033282, epoch loss=0.439935
Batch=539, step=17340, lr=0.128000, batch loss=0.053454, epoch loss=0.493388
Batch=599, step=17400, lr=0.127750, batch loss=0.035537, epoch loss=0.528926
Batch=659, step=17460, lr=0.127500, batch loss=0.050611, epoch loss=0.579536
Batch=719, step=17520, lr=0.127250, batch loss=0.040489, epoch loss=0.620025
Batch=779, step=17580, lr=0.127000, batch loss=0.040360, epoch loss=0.660385
Batch=839, step=17640, lr=0.126750, batch loss=0.052486, epoch loss=0.712871
Batch=899, step=17700, lr=0.126500, batch loss=0.046779, epoch loss=0.759650
Batch=959, step=17760, lr=0.126250, batch loss=0.017090, epoch loss=0.776740
Batch=1019, step=17820, lr=0.126000, batch loss=0.038249, epoch loss=0.814990
Batch=1079, step=17880, lr=0.125750, batch loss=0.072859, epoch loss=0.887849
Batch=1139, step=17940, lr=0.125500, batch loss=0.172519, epoch loss=1.060368
Batch=1199, step=18000, lr=0.125250, batch loss=0.034367, epoch loss=1.094736
Epoch=14, step=18000, lr=0.125250, epoch loss=1.094736
Batch=59, step=18060, lr=0.124750, batch loss=0.022506, epoch loss=0.022506
Batch=119, step=18120, lr=0.124750, batch loss=0.021408, epoch loss=0.043914
Batch=179, step=18180, lr=0.124500, batch loss=0.033216, epoch loss=0.077130
Batch=239, step=18240, lr=0.124250, batch loss=0.042148, epoch loss=0.119278
Batch=299, step=18300, lr=0.124000, batch loss=0.009655, epoch loss=0.128934
Batch=359, step=18360, lr=0.123750, batch loss=0.029066, epoch loss=0.157999
Batch=419, step=18420, lr=0.123500, batch loss=0.032023, epoch loss=0.190022
Batch=479, step=18480, lr=0.123250, batch loss=0.020063, epoch loss=0.210085
Batch=539, step=18540, lr=0.123000, batch loss=0.031291, epoch loss=0.241376
Batch=599, step=18600, lr=0.122500, batch loss=0.026414, epoch loss=0.267790
Batch=659, step=18660, lr=0.122500, batch loss=0.033566, epoch loss=0.301357
Batch=719, step=18720, lr=0.122000, batch loss=0.044620, epoch loss=0.345976
Batch=779, step=18780, lr=0.122000, batch loss=0.108548, epoch loss=0.454524
Batch=839, step=18840, lr=0.121500, batch loss=0.047247, epoch loss=0.501771
Batch=899, step=18900, lr=0.121500, batch loss=0.051775, epoch loss=0.553546
Batch=959, step=18960, lr=0.121250, batch loss=0.014690, epoch loss=0.568236
Batch=1019, step=19020, lr=0.121000, batch loss=0.023310, epoch loss=0.591546
Batch=1079, step=19080, lr=0.120750, batch loss=0.015075, epoch loss=0.606621
Batch=1139, step=19140, lr=0.120500, batch loss=0.031373, epoch loss=0.637994
Batch=1199, step=19200, lr=0.120250, batch loss=0.011729, epoch loss=0.649724
Epoch=15, step=19200, lr=0.120250, epoch loss=0.649724
Batch=59, step=19260, lr=0.120000, batch loss=0.011512, epoch loss=0.011512
Batch=119, step=19320, lr=0.119750, batch loss=0.027243, epoch loss=0.038755
Batch=179, step=19380, lr=0.119500, batch loss=0.086128, epoch loss=0.124883
Batch=239, step=19440, lr=0.119250, batch loss=0.035128, epoch loss=0.160011
Batch=299, step=19500, lr=0.119000, batch loss=0.008408, epoch loss=0.168418
Batch=359, step=19560, lr=0.118750, batch loss=0.026717, epoch loss=0.195135
Batch=419, step=19620, lr=0.118500, batch loss=0.020981, epoch loss=0.216117
Batch=479, step=19680, lr=0.118250, batch loss=0.007223, epoch loss=0.223339
Batch=539, step=19740, lr=0.117750, batch loss=0.024238, epoch loss=0.247577
Batch=599, step=19800, lr=0.117500, batch loss=0.026897, epoch loss=0.274474
Batch=659, step=19860, lr=0.117500, batch loss=0.019398, epoch loss=0.293872
Batch=719, step=19920, lr=0.117250, batch loss=0.044354, epoch loss=0.338225
Batch=779, step=19980, lr=0.116750, batch loss=0.078045, epoch loss=0.416270
Batch=839, step=20040, lr=0.116750, batch loss=0.031177, epoch loss=0.447448
Batch=899, step=20100, lr=0.116500, batch loss=0.033523, epoch loss=0.480971
Batch=959, step=20160, lr=0.116000, batch loss=0.011558, epoch loss=0.492529
Batch=1019, step=20220, lr=0.116000, batch loss=0.014450, epoch loss=0.506979
Batch=1079, step=20280, lr=0.115750, batch loss=0.002839, epoch loss=0.509818
Batch=1139, step=20340, lr=0.115500, batch loss=0.015320, epoch loss=0.525138
Batch=1199, step=20400, lr=0.115250, batch loss=0.006950, epoch loss=0.532087
Epoch=16, step=20400, lr=0.115250, epoch loss=0.532087
Batch=59, step=20460, lr=0.115000, batch loss=0.003115, epoch loss=0.003115
Batch=119, step=20520, lr=0.114750, batch loss=0.009269, epoch loss=0.012384
Batch=179, step=20580, lr=0.114500, batch loss=0.019957, epoch loss=0.032341
Batch=239, step=20640, lr=0.114250, batch loss=0.021325, epoch loss=0.053665
Batch=299, step=20700, lr=0.114000, batch loss=0.009785, epoch loss=0.063450
Batch=359, step=20760, lr=0.113500, batch loss=0.014953, epoch loss=0.078403
Batch=419, step=20820, lr=0.113500, batch loss=0.015677, epoch loss=0.094080
Batch=479, step=20880, lr=0.113250, batch loss=0.003452, epoch loss=0.097533
Batch=539, step=20940, lr=0.113000, batch loss=0.016812, epoch loss=0.114345
Batch=599, step=21000, lr=0.112750, batch loss=0.020818, epoch loss=0.135163
Batch=659, step=21060, lr=0.112500, batch loss=0.016691, epoch loss=0.151854
Batch=719, step=21120, lr=0.112250, batch loss=0.034359, epoch loss=0.186213
Batch=779, step=21180, lr=0.112000, batch loss=0.051046, epoch loss=0.237259
Batch=839, step=21240, lr=0.111750, batch loss=0.026267, epoch loss=0.263525
Batch=899, step=21300, lr=0.111500, batch loss=0.025467, epoch loss=0.288993
Batch=959, step=21360, lr=0.111250, batch loss=0.016396, epoch loss=0.305389
Batch=1019, step=21420, lr=0.111000, batch loss=0.013094, epoch loss=0.318483
Batch=1079, step=21480, lr=0.110750, batch loss=0.002616, epoch loss=0.321099
Batch=1139, step=21540, lr=0.110250, batch loss=0.012186, epoch loss=0.333286
Batch=1199, step=21600, lr=0.110250, batch loss=0.004834, epoch loss=0.338119
Epoch=17, step=21600, lr=0.110250, epoch loss=0.338119
Batch=59, step=21660, lr=0.110000, batch loss=0.002228, epoch loss=0.002228
Batch=119, step=21720, lr=0.109750, batch loss=0.007546, epoch loss=0.009773
Batch=179, step=21780, lr=0.109500, batch loss=0.012307, epoch loss=0.022080
Batch=239, step=21840, lr=0.109250, batch loss=0.009684, epoch loss=0.031764
Batch=299, step=21900, lr=0.109000, batch loss=0.006400, epoch loss=0.038164
Batch=359, step=21960, lr=0.108750, batch loss=0.017564, epoch loss=0.055728
Batch=419, step=22020, lr=0.108500, batch loss=0.012965, epoch loss=0.068693
Batch=479, step=22080, lr=0.108250, batch loss=0.002855, epoch loss=0.071548
Batch=539, step=22140, lr=0.108000, batch loss=0.018438, epoch loss=0.089986
Batch=599, step=22200, lr=0.107750, batch loss=0.016027, epoch loss=0.106013
Batch=659, step=22260, lr=0.107500, batch loss=0.011890, epoch loss=0.117903
Batch=719, step=22320, lr=0.107250, batch loss=0.016632, epoch loss=0.134535
Batch=779, step=22380, lr=0.107000, batch loss=0.022909, epoch loss=0.157445
Batch=839, step=22440, lr=0.106750, batch loss=0.025927, epoch loss=0.183372
Batch=899, step=22500, lr=0.106500, batch loss=0.026510, epoch loss=0.209883
Batch=959, step=22560, lr=0.106250, batch loss=0.009140, epoch loss=0.219023
Batch=1019, step=22620, lr=0.106000, batch loss=0.009014, epoch loss=0.228037
Batch=1079, step=22680, lr=0.105750, batch loss=0.002157, epoch loss=0.230194
Batch=1139, step=22740, lr=0.105500, batch loss=0.010786, epoch loss=0.240979
Batch=1199, step=22800, lr=0.105250, batch loss=0.004817, epoch loss=0.245797
Epoch=18, step=22800, lr=0.105250, epoch loss=0.245797
Batch=59, step=22860, lr=0.105000, batch loss=0.001928, epoch loss=0.001928
Batch=119, step=22920, lr=0.104750, batch loss=0.005695, epoch loss=0.007623
Batch=179, step=22980, lr=0.104500, batch loss=0.011050, epoch loss=0.018673
Batch=239, step=23040, lr=0.104250, batch loss=0.011731, epoch loss=0.030404
Batch=299, step=23100, lr=0.104000, batch loss=0.008423, epoch loss=0.038827
Batch=359, step=23160, lr=0.103750, batch loss=0.011268, epoch loss=0.050095
Batch=419, step=23220, lr=0.103500, batch loss=0.010770, epoch loss=0.060865
Batch=479, step=23280, lr=0.103250, batch loss=0.002488, epoch loss=0.063352
Batch=539, step=23340, lr=0.103000, batch loss=0.017247, epoch loss=0.080599
Batch=599, step=23400, lr=0.102750, batch loss=0.013853, epoch loss=0.094452
Batch=659, step=23460, lr=0.102500, batch loss=0.010492, epoch loss=0.104944
Batch=719, step=23520, lr=0.102000, batch loss=0.015388, epoch loss=0.120332
Batch=779, step=23580, lr=0.102000, batch loss=0.021853, epoch loss=0.142185
Batch=839, step=23640, lr=0.101750, batch loss=0.024745, epoch loss=0.166930
Batch=899, step=23700, lr=0.101500, batch loss=0.024053, epoch loss=0.190983
Batch=959, step=23760, lr=0.101250, batch loss=0.009007, epoch loss=0.199990
Batch=1019, step=23820, lr=0.101000, batch loss=0.007882, epoch loss=0.207872
Batch=1079, step=23880, lr=0.100750, batch loss=0.000717, epoch loss=0.208588
Batch=1139, step=23940, lr=0.100500, batch loss=0.009710, epoch loss=0.218299
Batch=1199, step=24000, lr=0.100250, batch loss=0.004802, epoch loss=0.223101
Epoch=19, step=24000, lr=0.100250, epoch loss=0.223101
Half-moons scatterplot and decision boundary:
┌────────────────────────────────────────────────────────────────────────────────────────────────────┐
│********************************#*******************************************************************│
│**********************#*#*#######*###*#####*********************************************************│
│**********************#########################*****************************************************│
│*****************#**########*######*###########*###*************************************************│
│***************#################*###################************************************************│
│************######*#################*#################**********************************************│
│**********#*#####*########*#**************##*#########*#********************************************│
│***********########*##*#******************#*****##########*****************************************.│
│***********###########*************************############*************************************....│
│********######*####*********************************###*###*#*********************************......│
│*******######**##*************....*****************#*######*#******************************.........│
│*******##*##**##**********...........***************########*##**************************...........│
│*****#######************.......%...%%...***************#########***********************...........%.│
│******######**********..........%.........**************##*#####**********************........%.%.%.│
│***#########**********.........%%%.%%......*************#*#######********************........%.%%%%.│
│****#######**********..........%%%%.........************#########*******************.........%%.%%.%│
│**#######************..........%%%%%%%........*************###*###*****************..........%%%%%%.│
│*##*####************...........%%%%%%%.........***********########***************............%%%%%%.│
│*#######************...........%%%%%%%...........***********#######*************.............%%%%%%.│
│*##*####***********............%%.%%%%%...........***********####**************.............%%%%%%%.│
│*#####*#**********..............%%%%%%%............**********##*###***********...............%%%%%..│
│#######***********.............%.%%%%%%..............********#######*********..............%%%%.%%..│
│#####*#**********...............%%%%%%%...............*******#######*******................%%%%%%%%.│
│###*#*#**********...............%%%%%%%%%..............*******######******.................%%%%%%...│
│#######*********.................%%%%%%%%................****###*###*****.................%%%%%%....│
│######**********.................%%%%%%%%%................***#*###******................%%%%%%%%%...│
│*#*##*#********...................%%%%%%%%%%................**######***..................%%%%%%.....│
│#****##********....................%%%%%%%%%.................**###*#*.................%.%%%%%%%.....│
│**************.....................%.%%%%%%...................******...................%.%%.%%......│
│*************........................%..%%%%%%%.................***...............%.%%%%%%%%%.......│
│*************.........................%.%%%.%%%%.................*................%%%%%%%.%.%.......│
│************............................%..%%%%..%................................%%%%%%%%..........│
│************.............................%%%%%%%%%%%........................%%..%%%%%%%%.%..........│
│***********..............................%%.%%%%%%%%..%....................%..%%%.%%%%%%%...........│
│***********.................................%%%%.%%%%%%%%...............%.%%%%%%%%%%%%.%............│
│**********...................................%%%%%%%%%%%%%%%%%%%%%%.%%%%.%%%%%%%%%%%%%..............│
│**********....................................%%.%%%%%%%%%%%%%%%%%%%%%%.%%%%%%%%%%%.................│
│*********.........................................%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%...................│
│********.............................................%%%.%%%%%%%%%%%%%%%%%%%%%......................│
│********................................................%...%%%%.%%.%%%%..%.........................│
└────────────────────────────────────────────────────────────────────────────────────────────────────┘
"/usr/bin/env" "bash" "-c" "opam exec -- dune build @install @check @runtest && rm -rf _build" failed with exit status 1
2025-05-22 20:06.18: Job failed: Failed: Build failed
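Aside on the log above: the learning rate decreases in 0.00025 decrements, roughly once every 60 steps, falling from 0.140250 at step 14400 to 0.100250 at step 24000, which is consistent with a linear warm-down toward a floor. A minimal OCaml sketch of such a schedule, where `base_lr`, `min_lr`, `decay`, and `steps_per_decay` are all values inferred from the printed lines rather than taken from OCannl's sources (the logged values match this sketch to within one decrement):

```ocaml
(* Stepwise-decaying learning-rate schedule reconstructed from the log:
   the rate drops by [decay] once every [steps_per_decay] training steps,
   starting at [base_lr] and clamped at [min_lr]. Illustrative only; the
   constants are read off the log, not OCannl's actual optimizer code. *)
let base_lr = 0.2
let min_lr = 0.1
let decay = 0.00025
let steps_per_decay = 60

let lr_at_step step =
  (* Number of completed decrements at this step. *)
  let drops = step / steps_per_decay in
  Float.max min_lr (base_lr -. decay *. float_of_int drops)
```

For example, `lr_at_step 14400` yields 0.14 and `lr_at_step 24000` yields 0.1, one decrement away from the logged 0.140250 and 0.100250, suggesting a small offset in when the first drop is applied.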