2025-05-15 13:37.49: New job: test ahrefs/ocannl https://github.com/ahrefs/ocannl.git#refs/heads/master (9156a204994bc18bf1bb375e2a01c887bd47fc08) (linux-x86_64:fedora-42-5.3_opam-2.3)
Base: ocaml/opam:fedora-42-ocaml-5.3@sha256:340ef8413fe195bbf54fd669127e946ef9bf60ec9b789cfee1f165e261f69373
Opam project build

To reproduce locally:

git clone --recursive "https://github.com/ahrefs/ocannl.git" -b "master" && cd "ocannl" && git reset --hard 9156a204
cat > Dockerfile <<'END-OF-DOCKERFILE'
FROM ocaml/opam:fedora-42-ocaml-5.3@sha256:340ef8413fe195bbf54fd669127e946ef9bf60ec9b789cfee1f165e261f69373
# fedora-42-5.3_opam-2.3
USER 1000:1000
ENV CLICOLOR_FORCE="1"
ENV OPAMCOLOR="always"
WORKDIR /src
RUN sudo dnf install -y findutils
RUN sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam
RUN opam init --reinit -ni
RUN uname -rs && opam exec -- ocaml -version && opam --version
WORKDIR /src
RUN sudo chown opam /src
RUN cd ~/opam-repository && (git cat-file -e 997e4758ac95ae5ee2ee30125e6ba0dba68cebf0 || git fetch origin master) && git reset -q --hard 997e4758ac95ae5ee2ee30125e6ba0dba68cebf0 && git log --no-decorate -n1 --oneline && opam update -u
COPY --chown=1000:1000 neural_nets_lib.opam arrayjit.opam ./
RUN opam pin add -yn neural_nets_lib.dev './' && \
    opam pin add -yn arrayjit.dev './'
RUN echo '(lang dune 3.0)' > './dune-project'
ENV DEPS="angstrom.0.16.1 astring.0.8.5 backoff.0.1.1 base.v0.17.2 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.18.2 dune-configurator.3.18.2 fieldslib.v0.17.0 fmt.0.10.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 logs.0.8.0 mdx.2.5.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.0 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 pprint.20230830 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.2.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 result.1.5 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 thread-local-storage.0.2 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0"
ENV CI="true"
ENV OCAMLCI="true"
RUN opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS
RUN opam install $DEPS
COPY --chown=1000:1000 . /src
RUN opam exec -- dune build @install @check @runtest && rm -rf _build
END-OF-DOCKERFILE

docker build .
END-REPRO-BLOCK 2025-05-15 13:37.49: Using cache hint "ahrefs/ocannl-ocaml/opam:fedora-42-ocaml-5.3@sha256:340ef8413fe195bbf54fd669127e946ef9bf60ec9b789cfee1f165e261f69373-fedora-42-5.3_opam-2.3-cdc9572ad54e4d4bf194acfcdfaa690c" 2025-05-15 13:37.49: Using OBuilder spec: ((from ocaml/opam:fedora-42-ocaml-5.3@sha256:340ef8413fe195bbf54fd669127e946ef9bf60ec9b789cfee1f165e261f69373) (comment fedora-42-5.3_opam-2.3) (user (uid 1000) (gid 1000)) (env CLICOLOR_FORCE 1) (env OPAMCOLOR always) (workdir /src) (run (network host) (shell "sudo dnf install -y findutils")) (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam")) (run (shell "opam init --reinit -ni")) (run (shell "uname -rs && opam exec -- ocaml -version && opam --version")) (workdir /src) (run (shell "sudo chown opam /src")) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 997e4758ac95ae5ee2ee30125e6ba0dba68cebf0 || git fetch origin master) && git reset -q --hard 997e4758ac95ae5ee2ee30125e6ba0dba68cebf0 && git log --no-decorate -n1 --oneline && opam update -u")) (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./)) (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'")) (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'")) (env DEPS "angstrom.0.16.1 astring.0.8.5 backoff.0.1.1 base.v0.17.2 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.18.2 dune-configurator.3.18.2 fieldslib.v0.17.0 fmt.0.10.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 logs.0.8.0 mdx.2.5.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.0 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 pprint.20230830 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.2.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 result.1.5 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 thread-local-storage.0.2 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0") (env CI true) (env OCAMLCI true) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS")) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS")) (copy (src .) 
  (dst /src))
 (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build"))
)

2025-05-15 13:37.49: Waiting for resource in pool OCluster
2025-05-15 13:37.50: Waiting for worker…
2025-05-15 13:48.35: Got resource from pool OCluster
Building on laodoke.caelum.ci.dev
HEAD is now at 3718ebe9 Bug fix pp_array_offset (AI slop)
HEAD is now at 9156a204 Logging test harness

(from ocaml/opam:fedora-42-ocaml-5.3@sha256:340ef8413fe195bbf54fd669127e946ef9bf60ec9b789cfee1f165e261f69373)
2025-05-15 13:49.49 ---> saved as "58e80f2943667cc892930b8f00145b341640b9631e46b0e990690977929d47d4"

/: (comment fedora-42-5.3_opam-2.3)
/: (user (uid 1000) (gid 1000))
/: (env CLICOLOR_FORCE 1)
/: (env OPAMCOLOR always)
/: (workdir /src)
/src: (run (network host)
           (shell "sudo dnf install -y findutils"))
Updating and loading repositories:
 Fedora 42 - x86_64 - Updates 100% | 56.9 KiB/s | 18.8 KiB | 00m00s
 Fedora 42 - x86_64 - Updates 100% | 2.6 MiB/s | 2.3 MiB | 00m01s
Repositories loaded.
Package "findutils-1:4.10.0-5.fc42.x86_64" is already installed.
Nothing to do.
2025-05-15 13:49.56 ---> saved as "5e75f4f90bede632f1905a7c2c6dd18b38094c39f21cf56cfd80e66f2a5ea1fb"

/src: (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam"))
2025-05-15 13:49.57 ---> saved as "99c48dd0c90841dd0557a3605ead22147b57247e14e3720a707113a3fd5cb79d"

/src: (run (shell "opam init --reinit -ni"))
Configuring from /home/opam/.opamrc and then from built-in defaults.
Checking for available remotes: rsync and local, git.
  - you won't be able to use mercurial repositories unless you install the hg command on your system.
  - you won't be able to use darcs repositories unless you install the darcs command on your system.
This development version of opam requires an update to the layout of /home/opam/.opam from version 2.0 to version 2.2, which can't be reverted.
You may want to back it up before going further. Continue? [y/n] y
Format upgrade done.

<><> Updating repositories ><><><><><><><><><><><><><><><><><><><><><><><><><><>
[default] Initialised
2025-05-15 13:50.49 ---> saved as "f466099dbad0be430d5d88b3056e1b5546981294ac309650866ef1379cd6d212"

/src: (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
Linux 5.15.0-139-generic
The OCaml toplevel, version 5.3.0
2.3.0
2025-05-15 13:50.49 ---> saved as "9495bb6675cefd0fde9c3d3172875d2f69d43b4b48aa2c1fc3cbbdc7685c23a4"

/src: (workdir /src)
/src: (run (shell "sudo chown opam /src"))
2025-05-15 13:50.49 ---> saved as "32bd88dfd4a72ae99f6d3d856e0c727304b5b119984d04b7c88a6f9d56b49fb7"

/src: (run (cache (opam-archives (target /home/opam/.opam/download-cache)))
           (network host)
           (shell "cd ~/opam-repository && (git cat-file -e 997e4758ac95ae5ee2ee30125e6ba0dba68cebf0 || git fetch origin master) && git reset -q --hard 997e4758ac95ae5ee2ee30125e6ba0dba68cebf0 && git log --no-decorate -n1 --oneline && opam update -u"))
997e4758ac Merge pull request #27839 from public-release/opam-publish-base.v0.17.2

<><> Updating package repositories ><><><><><><><><><><><><><><><><><><><><><><>
[default] synchronised from git+file:///home/opam/opam-repository
Everything as up-to-date as possible (run with --verbose to show unavailable upgrades).
However, you may "opam upgrade" these packages explicitly, which will ask permission to downgrade or uninstall the conflicting packages.
Nothing to do.
# To update the current shell environment, run: eval $(opam env) 2025-05-15 13:51.14 ---> saved as "258d56210f3e71e9897edd6a05b53422b4d8b3fc50c99e4e525a14464782f7cb" /src: (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./)) 2025-05-15 13:51.14 ---> saved as "488e452321e667b2c669dfc12a8ada5a6310bbee470acf388851e055b66a2173" /src: (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'")) [neural_nets_lib.dev] synchronised (file:///src) neural_nets_lib is now pinned to file:///src (version dev) [arrayjit.dev] synchronised (file:///src) arrayjit is now pinned to file:///src (version dev) 2025-05-15 13:51.16 ---> saved as "8fa86840d1e6aa4b43a4e44d7dd54fd178009a5a8b2207e2f35552de65a7b30f" /src: (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'")) 2025-05-15 13:51.16 ---> saved as "5a4511450e9e99a38c72c663d411bec8074130faebe05d7fce0da8e8c5c58f5f" /src: (env DEPS "angstrom.0.16.1 astring.0.8.5 backoff.0.1.1 base.v0.17.2 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.18.2 dune-configurator.3.18.2 fieldslib.v0.17.0 fmt.0.10.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 logs.0.8.0 mdx.2.5.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.0 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 pprint.20230830 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.2.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 result.1.5 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 thread-local-storage.0.2 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0") /src: (env CI true) /src: (env OCAMLCI true) /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS")) + /usr/sbin/sudo "yum" "makecache" - Updating and loading repositories: - Repositories loaded. - Metadata cache created. <><> Synchronising pinned packages ><><><><><><><><><><><><><><><><><><><><><><> [arrayjit.dev] synchronised (file:///src) [neural_nets_lib.dev] synchronised (file:///src) [NOTE] Package ocaml-options-vanilla is already installed (current version is 1). [NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). [NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). 
[NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). [NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). The following system packages will first need to be installed: libffi-devel <><> Handling external dependencies <><><><><><><><><><><><><><><><><><><><><><> + /usr/sbin/sudo "yum" "install" "-y" "libffi-devel" - Updating and loading repositories: - Repositories loaded. - Package Arch Version Repository Size - Installing: - libffi-devel x86_64 3.4.6-5.fc42 fedora 33.1 KiB - - Transaction Summary: - Installing: 1 package - - Total size of inbound packages is 29 KiB. Need to download 29 KiB. - After this operation, 33 KiB extra will be used (install 33 KiB, remove 0 B). - [1/1] libffi-devel-0:3.4.6-5.fc42.x86_6 100% | 306.2 KiB/s | 28.8 KiB | 00m00s - -------------------------------------------------------------------------------- - [1/1] Total 100% | 103.2 KiB/s | 28.8 KiB | 00m00s - Running transaction - [1/3] Verify package files 100% | 0.0 B/s | 1.0 B | 00m00s - [2/3] Prepare transaction 100% | 32.0 B/s | 1.0 B | 00m00s - [3/3] Installing libffi-devel-0:3.4.6-5 100% | 579.9 KiB/s | 34.8 KiB | 00m00s - Complete! + /usr/sbin/rpm "-q" "--whatprovides" "libffi-devel" - libffi-devel-3.4.6-5.fc42.x86_64 2025-05-15 13:51.30 ---> saved as "1386309c1462d1dcd62997026eeace65f0e381655874263b8a00df6fbb6f08f5" /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS")) [NOTE] Package ocaml-options-vanilla is already installed (current version is 1). [NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). [NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). [NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). [NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). 
The following actions will be performed: === install 75 packages - install angstrom 0.16.1 - install astring 0.8.5 - install backoff 0.1.1 - install base v0.17.2 - install bigarray-compat 1.1.0 - install bigstringaf 0.10.0 - install camlp-streams 5.0.1 - install cmdliner 1.3.0 - install conf-libffi 2.0.0 - install conf-pkg-config 4 - install cppo 1.8.0 - install csexp 1.5.2 - install ctypes 0.23.0 - install ctypes-foreign 0.23.0 - install dune 3.18.2 - install dune-configurator 3.18.2 - install fieldslib v0.17.0 - install fmt 0.10.0 - install integers 0.7.0 - install jane-street-headers v0.17.0 - install jst-config v0.17.0 - install logs 0.8.0 - install mdx 2.5.0 - install mtime 2.1.0 - install multicore-magic 2.3.1 - install num 1.5-1 - install ocaml-compiler-libs v0.17.0 - install ocaml-syntax-shims 1.0.0 - install ocaml-version 4.0.0 - install ocaml_intrinsics_kernel v0.17.1 - install ocamlbuild 0.16.1 - install ocamlfind 1.9.8 - install parsexp v0.17.0 - install pprint 20230830 - install ppx_assert v0.17.0 - install ppx_base v0.17.0 - install ppx_cold v0.17.0 - install ppx_compare v0.17.0 - install ppx_derivers 1.2.1 - install ppx_deriving 6.0.3 - install ppx_enumerate v0.17.0 - install ppx_expect v0.17.2 - install ppx_fields_conv v0.17.0 - install ppx_globalize v0.17.0 - install ppx_hash v0.17.0 - install ppx_here v0.17.0 - install ppx_inline_test v0.17.0 - install ppx_minidebug 2.2.0 - install ppx_optcomp v0.17.0 - install ppx_sexp_conv v0.17.0 - install ppx_string v0.17.0 - install ppx_variants_conv v0.17.0 - install ppxlib 0.35.0 - install ppxlib_jane v0.17.2 - install printbox 0.12 - install printbox-ext-plot 0.12 - install printbox-html 0.12 - install printbox-md 0.12 - install printbox-text 0.12 - install ptime 1.2.0 - install re 1.12.0 - install result 1.5 - install saturn_lockfree 0.5.0 - install seq base - install sexplib v0.17.0 - install sexplib0 v0.17.0 - install stdio v0.17.0 - install stdlib-shims 0.3.0 - install thread-local-storage 0.2 - install time_now v0.17.0 - install topkg 1.0.8 - install tyxml 4.6.0 - install uucp 16.0.0 - install uutf 1.0.4 - install variantslib v0.17.0 <><> Processing actions <><><><><><><><><><><><><><><><><><><><><><><><><><><><> -> retrieved backoff.0.1.1 (cached) -> retrieved angstrom.0.16.1 (cached) -> retrieved astring.0.8.5 (cached) -> retrieved bigarray-compat.1.1.0 (cached) -> retrieved bigstringaf.0.10.0 (cached) -> retrieved base.v0.17.2 (cached) -> retrieved camlp-streams.5.0.1 (cached) -> retrieved cmdliner.1.3.0 (cached) -> retrieved cppo.1.8.0 (cached) -> installed conf-pkg-config.4 -> retrieved csexp.1.5.2 (cached) -> retrieved ctypes.0.23.0, ctypes-foreign.0.23.0 (cached) -> installed conf-libffi.2.0.0 -> retrieved fieldslib.v0.17.0 (cached) -> retrieved fmt.0.10.0 (cached) -> retrieved integers.0.7.0 (cached) -> retrieved jane-street-headers.v0.17.0 (cached) -> retrieved jst-config.v0.17.0 (cached) -> retrieved logs.0.8.0 (cached) -> retrieved mtime.2.1.0 (cached) -> retrieved mdx.2.5.0 (cached) -> retrieved multicore-magic.2.3.1 (cached) -> retrieved num.1.5-1 (cached) -> retrieved ocaml-compiler-libs.v0.17.0 (cached) -> retrieved ocaml-syntax-shims.1.0.0 (cached) -> retrieved ocaml-version.4.0.0 (cached) -> retrieved ocaml_intrinsics_kernel.v0.17.1 (cached) -> retrieved ocamlbuild.0.16.1 (cached) -> retrieved ocamlfind.1.9.8 (cached) -> retrieved parsexp.v0.17.0 (cached) -> retrieved dune.3.18.2, dune-configurator.3.18.2 (cached) -> retrieved pprint.20230830 (cached) -> retrieved ppx_assert.v0.17.0 (cached) -> retrieved 
ppx_base.v0.17.0 (cached) -> retrieved ppx_cold.v0.17.0 (cached) -> retrieved ppx_compare.v0.17.0 (cached) -> retrieved ppx_derivers.1.2.1 (cached) -> retrieved ppx_enumerate.v0.17.0 (cached) -> retrieved ppx_deriving.6.0.3 (cached) -> retrieved ppx_expect.v0.17.2 (cached) -> retrieved ppx_fields_conv.v0.17.0 (cached) -> retrieved ppx_globalize.v0.17.0 (cached) -> retrieved ppx_hash.v0.17.0 (cached) -> installed cmdliner.1.3.0 -> installed num.1.5-1 -> retrieved ppx_here.v0.17.0 (cached) -> retrieved ppx_inline_test.v0.17.0 (cached) -> retrieved ppx_minidebug.2.2.0 (cached) -> retrieved ppx_optcomp.v0.17.0 (cached) -> retrieved ppx_sexp_conv.v0.17.0 (cached) -> retrieved ppx_string.v0.17.0 (cached) -> retrieved ppx_variants_conv.v0.17.0 (cached) -> retrieved ppxlib_jane.v0.17.2 (cached) -> retrieved ptime.1.2.0 (cached) -> retrieved re.1.12.0 (cached) -> retrieved result.1.5 (cached) -> retrieved saturn_lockfree.0.5.0 (cached) -> retrieved seq.base (cached) -> installed seq.base -> retrieved sexplib.v0.17.0 (cached) -> retrieved sexplib0.v0.17.0 (cached) -> retrieved ppxlib.0.35.0 (cached) -> retrieved stdio.v0.17.0 (cached) -> retrieved stdlib-shims.0.3.0 (cached) -> retrieved thread-local-storage.0.2 (cached) -> retrieved time_now.v0.17.0 (cached) -> retrieved topkg.1.0.8 (cached) -> retrieved tyxml.4.6.0 (cached) -> retrieved printbox.0.12, printbox-ext-plot.0.12, printbox-html.0.12, printbox-md.0.12, printbox-text.0.12 (cached) -> retrieved uutf.1.0.4 (cached) -> retrieved variantslib.v0.17.0 (cached) -> retrieved uucp.16.0.0 (cached) -> installed ocamlfind.1.9.8 -> installed ocamlbuild.0.16.1 -> installed topkg.1.0.8 -> installed uutf.1.0.4 -> installed mtime.2.1.0 -> installed fmt.0.10.0 -> installed ptime.1.2.0 -> installed astring.0.8.5 -> installed logs.0.8.0 -> installed dune.3.18.2 -> installed ppx_derivers.1.2.1 -> installed jane-street-headers.v0.17.0 -> installed printbox.0.12 -> installed ocaml-version.4.0.0 -> installed result.1.5 -> installed csexp.1.5.2 -> installed backoff.0.1.1 -> installed bigarray-compat.1.1.0 -> installed camlp-streams.5.0.1 -> installed multicore-magic.2.3.1 -> installed cppo.1.8.0 -> installed ocaml-compiler-libs.v0.17.0 -> installed ocaml-syntax-shims.1.0.0 -> installed ocaml_intrinsics_kernel.v0.17.1 -> installed pprint.20230830 -> installed re.1.12.0 -> installed sexplib0.v0.17.0 -> installed stdlib-shims.0.3.0 -> installed thread-local-storage.0.2 -> installed saturn_lockfree.0.5.0 -> installed integers.0.7.0 -> installed dune-configurator.3.18.2 -> installed parsexp.v0.17.0 -> installed bigstringaf.0.10.0 -> installed mdx.2.5.0 -> installed angstrom.0.16.1 -> installed sexplib.v0.17.0 -> installed tyxml.4.6.0 -> installed printbox-html.0.12 -> installed ctypes.0.23.0 -> installed base.v0.17.2 -> installed variantslib.v0.17.0 -> installed fieldslib.v0.17.0 -> installed ctypes-foreign.0.23.0 -> installed stdio.v0.17.0 -> installed uucp.16.0.0 -> installed printbox-text.0.12 -> installed printbox-md.0.12 -> installed printbox-ext-plot.0.12 -> installed ppxlib.0.35.0 -> installed ppxlib_jane.v0.17.2 -> installed ppx_optcomp.v0.17.0 -> installed ppx_here.v0.17.0 -> installed ppx_variants_conv.v0.17.0 -> installed ppx_cold.v0.17.0 -> installed ppx_fields_conv.v0.17.0 -> installed ppx_enumerate.v0.17.0 -> installed ppx_globalize.v0.17.0 -> installed ppx_deriving.6.0.3 -> installed ppx_compare.v0.17.0 -> installed ppx_sexp_conv.v0.17.0 -> installed ppx_hash.v0.17.0 -> installed ppx_assert.v0.17.0 -> installed ppx_base.v0.17.0 -> installed 
ppx_minidebug.2.2.0
-> installed jst-config.v0.17.0
-> installed ppx_string.v0.17.0
-> installed time_now.v0.17.0
-> installed ppx_inline_test.v0.17.0
-> installed ppx_expect.v0.17.2
Done.
# To update the current shell environment, run: eval $(opam env)
2025-05-15 13:54.00 ---> saved as "09ac8a187b2b0d600987c09f26ba2808673f4138e88e25a28ffa60ec5c83604f"

/src: (copy (src .) (dst /src))
2025-05-15 13:54.01 ---> saved as "a5120a3495910c7d1ebe6f9c813c08885b7b56e37fc4c7e2e84253b998f16a56"

/src: (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build"))
(cd _build/default/test_ppx && ./test_ppx_op_expected.exe)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/default/test_ppx && ./test_ppx_op.exe)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition '' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition 'Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/8263cb49b9f5b047b74ab69c520c6b77/default/test/ocannl_config.' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition 'Found 0, in the config file' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition 'Retrieving commandline, environment, or config file variable ocannl_log_level' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
File "test/dune", lines 19-41, characters 0-727:
19 | (rule
20 |  (target
21 |   (dir log_files))
....
39 |    "\\1[0]{=MAYBE UNINITIALIZED} = "
40 |    "log_files/micrograd_demo_logging-g_gradient_update.log"
41 |    "log_files/micrograd_demo_logging-g_gradient_update.log"))))
(cd _build/default/test && ./micrograd_demo_logging.exe --ocannl_debug_backend=text --ocannl_log_file_stem=micrograd_demo_logging --ocannl_log_main_domain_to_stdout=false --ocannl_debug_log_to_routine_files=overwrite)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
Retrieving commandline, environment, or config file variable ocannl_backend
Found cc, in the config file
Retrieving commandline, environment, or config file variable ocannl_cd_ident_style
Not found, using default heuristic
Retrieving commandline, environment, or config file variable ocannl_ll_ident_style
Not found, using default heuristic
Retrieving commandline, environment, or config file variable ocannl_debug_log_to_routine_files
Found overwrite, commandline --ocannl_debug_log_to_routine_files=overwrite
Retrieving commandline, environment, or config file variable ocannl_cc_backend_optimization_level
Not found, using default 3
Retrieving commandline, environment, or config file variable ocannl_cc_backend_compiler_command
Not found, using default gcc
Fatal error: exception File "src/printbox-text/PrintBox_text.ml", line 212, characters 6-12: Assertion failed
Raised at PrintBox_text.Output.Make_out.to_buf_aux_ in file "src/printbox-text/PrintBox_text.ml", line 212, characters 6-50
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 19-42
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 26-41
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 26-41
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 26-41
Called from PrintBox_text.Output.Make_out.render in file "src/printbox-text/PrintBox_text.ml", line 242, characters 14-64
Called from PrintBox_text.output in file "src/printbox-text/PrintBox_text.ml", line 851, characters 2-31
Called from Minidebug_runtime.PrintBox.output_box in file "minidebug_runtime.ml", line 1527, characters 19-59
Called from Minidebug_runtime.PrintBox.close_log_impl.close_tree in file "minidebug_runtime.ml", line 1572, characters 6-38
Called from Backends.Add_buffer_retrieval_and_syncing.sync_routine in file "arrayjit/lib/backends.ml", lines 144-172, characters 31-82
Called from Backends.Raise_backend.link in file "arrayjit/lib/backends.ml", lines 454-455, characters 4-92
Re-raised at Backends.Raise_backend.link in file "arrayjit/lib/backends.ml", lines 441-455, characters 23-92
Called from Dune__exe__Micrograd_demo_logging in file "test/micrograd_demo_logging.ml", line 34, characters 13-77
(cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition hello_world_op.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition primitive_ops.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition zero2hero_1of7.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition einsum_trivia.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition micrograd_demo.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition moons_demo_parallel.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/3e82f2ce96578ddc31d79976d78a0758/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/default/test && ./moons_demo_parallel_run.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test/ocannl_config. 
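[Editor's note] The repeated "Retrieving commandline, environment, or config file variable ..." lines in this run report where each OCANNL setting was resolved: a command-line flag when present (e.g. --ocannl_debug_log_to_routine_files=overwrite above), otherwise the ocannl_config file, otherwise a built-in default. The wording suggests the precedence command line, then environment, then config file, then default. A minimal OCaml sketch of that precedence follows; the helper names and messages are illustrative assumptions, not OCANNL's actual API.

(* Illustrative sketch only: [lookup] mimics the precedence suggested by the
   log messages. [find_config_file] is a stand-in for parsing ocannl_config. *)
let find_commandline key =
  let prefix = "--" ^ key ^ "=" in
  Array.to_list Sys.argv
  |> List.find_opt (fun a -> String.starts_with ~prefix a)
  |> Option.map (fun a ->
         String.sub a (String.length prefix) (String.length a - String.length prefix))

let find_config_file key = ignore key; None

let lookup ?default key =
  match find_commandline key with
  | Some v -> Printf.printf "Found %s, commandline --%s=%s\n" v key v; Some v
  | None ->
    match Sys.getenv_opt key with
    | Some v -> Printf.printf "Found %s, in the environment\n" v; Some v
    | None ->
      match find_config_file key with
      | Some v -> Printf.printf "Found %s, in the config file\n" v; Some v
      | None ->
        (match default with
         | Some d -> Printf.printf "Not found, using default %s\n" d
         | None -> print_endline "Not found, no default");
        default

let () = ignore (lookup ~default:"gcc" "ocannl_cc_backend_compiler_command")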
Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file ("Set log_level to" 1) └─{orphaned from #2} Retrieving commandline, environment, or config file variable ocannl_backend Found cc, in the config file Properties of devices: (multicore_devices (device ((device_name CPU) (device_ordinal 0) (num_domains 72)))) @!Retrieving commandline, environment, or config file variable ocannl_ll_ident_style Not found, using default heuristic Retrieving commandline, environment, or config file variable ocannl_cc_backend_optimization_level Not found, using default 3 Retrieving commandline, environment, or config file variable ocannl_cc_backend_compiler_command Not found, using default gcc Retrieving commandline, environment, or config file variable ocannl_never_capture_stdout Not found, using default false Batch=59, step=60, lr=0.199750, batch loss=23.609453, epoch loss=23.609453 Batch=119, step=120, lr=0.199500, batch loss=8.516926, epoch loss=32.126379 Batch=179, step=180, lr=0.199500, batch loss=2.639251, epoch loss=34.765630 Batch=239, step=240, lr=0.199250, batch loss=0.850854, epoch loss=35.616485 Batch=299, step=300, lr=0.199000, batch loss=1.448342, epoch loss=37.064827 Batch=359, step=360, lr=0.198750, batch loss=1.335675, epoch loss=38.400502 Batch=419, step=420, lr=0.198500, batch loss=0.615054, epoch loss=39.015555 Batch=479, step=480, lr=0.198000, batch loss=0.811093, epoch loss=39.826648 Batch=539, step=540, lr=0.197750, batch loss=0.707658, epoch loss=40.534307 Batch=599, step=600, lr=0.197750, batch loss=1.069258, epoch loss=41.603564 Batch=659, step=660, lr=0.197250, batch loss=0.482481, epoch loss=42.086045 Batch=719, step=720, lr=0.197250, batch loss=0.411202, epoch loss=42.497247 Batch=779, step=780, lr=0.197000, batch loss=0.469348, epoch loss=42.966595 Batch=839, step=840, lr=0.196500, batch loss=0.443965, epoch loss=43.410560 Batch=899, step=900, lr=0.196250, batch loss=0.383071, epoch loss=43.793631 Batch=959, step=960, lr=0.196250, batch loss=0.241779, epoch loss=44.035409 Batch=1019, step=1020, lr=0.195750, batch loss=0.452294, epoch loss=44.487703 Batch=1079, step=1080, lr=0.195500, batch loss=0.258814, epoch loss=44.746517 Batch=1139, step=1140, lr=0.195500, batch loss=0.339650, epoch loss=45.086167 Batch=1199, step=1200, lr=0.195250, batch loss=0.261408, epoch loss=45.347575 Epoch=0, step=1200, lr=0.195250, epoch loss=45.347575 Batch=59, step=1260, lr=0.195000, batch loss=0.260714, epoch loss=0.260714 Batch=119, step=1320, lr=0.194750, batch loss=0.203006, epoch loss=0.463720 Batch=179, step=1380, lr=0.194500, batch loss=0.244612, epoch loss=0.708332 Batch=239, step=1440, lr=0.194250, batch loss=0.348443, epoch loss=1.056776 Batch=299, step=1500, lr=0.194000, batch loss=0.238149, epoch loss=1.294925 Batch=359, step=1560, lr=0.193750, batch loss=0.312942, epoch loss=1.607866 Batch=419, step=1620, lr=0.193500, batch loss=0.309987, epoch loss=1.917853 Batch=479, step=1680, lr=0.193250, batch loss=0.276755, epoch loss=2.194607 Batch=539, step=1740, lr=0.193000, batch loss=0.210713, epoch loss=2.405320 Batch=599, step=1800, lr=0.192750, batch loss=0.252119, epoch loss=2.657440 Batch=659, step=1860, lr=0.192500, batch loss=0.369162, epoch loss=3.026601 Batch=719, step=1920, lr=0.192250, batch loss=0.365706, epoch loss=3.392307 Batch=779, step=1980, lr=0.192000, batch loss=0.383327, epoch loss=3.775635 Batch=839, step=2040, lr=0.191750, batch loss=0.340697, epoch loss=4.116332 Batch=899, step=2100, lr=0.191500, batch loss=0.295747, 
epoch loss=4.412078 Batch=959, step=2160, lr=0.191250, batch loss=0.214098, epoch loss=4.626177 Batch=1019, step=2220, lr=0.191000, batch loss=0.332362, epoch loss=4.958538 Batch=1079, step=2280, lr=0.190750, batch loss=0.205644, epoch loss=5.164182 Batch=1139, step=2340, lr=0.190500, batch loss=0.262431, epoch loss=5.426613 Batch=1199, step=2400, lr=0.190000, batch loss=0.212991, epoch loss=5.639605 Epoch=1, step=2400, lr=0.190000, epoch loss=5.639605 Batch=59, step=2460, lr=0.190000, batch loss=0.231957, epoch loss=0.231957 Batch=119, step=2520, lr=0.189750, batch loss=0.195879, epoch loss=0.427836 Batch=179, step=2580, lr=0.189500, batch loss=0.220051, epoch loss=0.647887 Batch=239, step=2640, lr=0.189250, batch loss=0.327250, epoch loss=0.975137 Batch=299, step=2700, lr=0.188750, batch loss=0.206437, epoch loss=1.181574 Batch=359, step=2760, lr=0.188500, batch loss=0.293094, epoch loss=1.474668 Batch=419, step=2820, lr=0.188500, batch loss=0.283100, epoch loss=1.757768 Batch=479, step=2880, lr=0.188250, batch loss=0.263431, epoch loss=2.021199 Batch=539, step=2940, lr=0.188000, batch loss=0.206387, epoch loss=2.227586 Batch=599, step=3000, lr=0.187750, batch loss=0.249819, epoch loss=2.477405 Batch=659, step=3060, lr=0.187500, batch loss=0.347636, epoch loss=2.825041 Batch=719, step=3120, lr=0.187250, batch loss=0.349326, epoch loss=3.174367 Batch=779, step=3180, lr=0.187000, batch loss=0.363625, epoch loss=3.537992 Batch=839, step=3240, lr=0.186500, batch loss=0.324967, epoch loss=3.862959 Batch=899, step=3300, lr=0.186500, batch loss=0.295477, epoch loss=4.158436 Batch=959, step=3360, lr=0.186250, batch loss=0.230590, epoch loss=4.389026 Batch=1019, step=3420, lr=0.186000, batch loss=0.344751, epoch loss=4.733777 Batch=1079, step=3480, lr=0.185750, batch loss=0.221193, epoch loss=4.954970 Batch=1139, step=3540, lr=0.185500, batch loss=0.261053, epoch loss=5.216023 Batch=1199, step=3600, lr=0.185000, batch loss=0.199922, epoch loss=5.415945 Epoch=2, step=3600, lr=0.185000, epoch loss=5.415945 Batch=59, step=3660, lr=0.185000, batch loss=0.222959, epoch loss=0.222959 Batch=119, step=3720, lr=0.184750, batch loss=0.184847, epoch loss=0.407806 Batch=179, step=3780, lr=0.184500, batch loss=0.212674, epoch loss=0.620480 Batch=239, step=3840, lr=0.184250, batch loss=0.318135, epoch loss=0.938616 Batch=299, step=3900, lr=0.184000, batch loss=0.212724, epoch loss=1.151340 Batch=359, step=3960, lr=0.183750, batch loss=0.292637, epoch loss=1.443977 Batch=419, step=4020, lr=0.183500, batch loss=0.301318, epoch loss=1.745294 Batch=479, step=4080, lr=0.183250, batch loss=0.258221, epoch loss=2.003516 Batch=539, step=4140, lr=0.183000, batch loss=0.196059, epoch loss=2.199575 Batch=599, step=4200, lr=0.182750, batch loss=0.237339, epoch loss=2.436914 Batch=659, step=4260, lr=0.182500, batch loss=0.331794, epoch loss=2.768708 Batch=719, step=4320, lr=0.182250, batch loss=0.327096, epoch loss=3.095803 Batch=779, step=4380, lr=0.181750, batch loss=0.348172, epoch loss=3.443975 Batch=839, step=4440, lr=0.181750, batch loss=0.317895, epoch loss=3.761871 Batch=899, step=4500, lr=0.181500, batch loss=0.288062, epoch loss=4.049933 Batch=959, step=4560, lr=0.181250, batch loss=0.236928, epoch loss=4.286861 Batch=1019, step=4620, lr=0.181000, batch loss=0.331209, epoch loss=4.618070 Batch=1079, step=4680, lr=0.180750, batch loss=0.194844, epoch loss=4.812914 Batch=1139, step=4740, lr=0.180500, batch loss=0.228524, epoch loss=5.041438 Batch=1199, step=4800, lr=0.180250, batch loss=0.193535, epoch 
loss=5.234973 Epoch=3, step=4800, lr=0.180250, epoch loss=5.234973 Batch=59, step=4860, lr=0.180000, batch loss=0.234346, epoch loss=0.234346 Batch=119, step=4920, lr=0.179750, batch loss=0.194562, epoch loss=0.428908 Batch=179, step=4980, lr=0.179500, batch loss=0.206500, epoch loss=0.635408 Batch=239, step=5040, lr=0.179250, batch loss=0.307623, epoch loss=0.943031 Batch=299, step=5100, lr=0.179000, batch loss=0.208665, epoch loss=1.151697 Batch=359, step=5160, lr=0.178750, batch loss=0.274035, epoch loss=1.425732 Batch=419, step=5220, lr=0.178500, batch loss=0.266341, epoch loss=1.692072 Batch=479, step=5280, lr=0.178250, batch loss=0.241358, epoch loss=1.933431 Batch=539, step=5340, lr=0.178000, batch loss=0.189476, epoch loss=2.122907 Batch=599, step=5400, lr=0.177750, batch loss=0.231590, epoch loss=2.354497 Batch=659, step=5460, lr=0.177500, batch loss=0.323913, epoch loss=2.678410 Batch=719, step=5520, lr=0.177250, batch loss=0.320635, epoch loss=2.999044 Batch=779, step=5580, lr=0.177000, batch loss=0.342744, epoch loss=3.341788 Batch=839, step=5640, lr=0.176750, batch loss=0.310771, epoch loss=3.652559 Batch=899, step=5700, lr=0.176500, batch loss=0.271034, epoch loss=3.923594 Batch=959, step=5760, lr=0.176250, batch loss=0.216334, epoch loss=4.139928 Batch=1019, step=5820, lr=0.176000, batch loss=0.333772, epoch loss=4.473700 Batch=1079, step=5880, lr=0.175750, batch loss=0.190343, epoch loss=4.664042 Batch=1139, step=5940, lr=0.175500, batch loss=0.224084, epoch loss=4.888126 Batch=1199, step=6000, lr=0.175250, batch loss=0.189344, epoch loss=5.077470 Epoch=4, step=6000, lr=0.175250, epoch loss=5.077470 Batch=59, step=6060, lr=0.175000, batch loss=0.233709, epoch loss=0.233709 Batch=119, step=6120, lr=0.174750, batch loss=0.191887, epoch loss=0.425595 Batch=179, step=6180, lr=0.174500, batch loss=0.201775, epoch loss=0.627370 Batch=239, step=6240, lr=0.174250, batch loss=0.299543, epoch loss=0.926913 Batch=299, step=6300, lr=0.174000, batch loss=0.198641, epoch loss=1.125554 Batch=359, step=6360, lr=0.173750, batch loss=0.265092, epoch loss=1.390646 Batch=419, step=6420, lr=0.173500, batch loss=0.258349, epoch loss=1.648995 Batch=479, step=6480, lr=0.173250, batch loss=0.232420, epoch loss=1.881415 Batch=539, step=6540, lr=0.173000, batch loss=0.190244, epoch loss=2.071659 Batch=599, step=6600, lr=0.172750, batch loss=0.233329, epoch loss=2.304988 Batch=659, step=6660, lr=0.172500, batch loss=0.313810, epoch loss=2.618798 Batch=719, step=6720, lr=0.172250, batch loss=0.316918, epoch loss=2.935716 Batch=779, step=6780, lr=0.172000, batch loss=0.331161, epoch loss=3.266876 Batch=839, step=6840, lr=0.171750, batch loss=0.306765, epoch loss=3.573642 Batch=899, step=6900, lr=0.171500, batch loss=0.269234, epoch loss=3.842876 Batch=959, step=6960, lr=0.171250, batch loss=0.207287, epoch loss=4.050164 Batch=1019, step=7020, lr=0.170750, batch loss=0.327274, epoch loss=4.377438 Batch=1079, step=7080, lr=0.170750, batch loss=0.175560, epoch loss=4.552998 Batch=1139, step=7140, lr=0.170500, batch loss=0.212914, epoch loss=4.765912 Batch=1199, step=7200, lr=0.170250, batch loss=0.183058, epoch loss=4.948970 Epoch=5, step=7200, lr=0.170250, epoch loss=4.948970 Batch=59, step=7260, lr=0.170000, batch loss=0.226138, epoch loss=0.226138 Batch=119, step=7320, lr=0.169750, batch loss=0.189386, epoch loss=0.415524 Batch=179, step=7380, lr=0.169500, batch loss=0.194449, epoch loss=0.609973 Batch=239, step=7440, lr=0.169250, batch loss=0.291563, epoch loss=0.901536 Batch=299, step=7500, 
lr=0.169000, batch loss=0.207610, epoch loss=1.109146 Batch=359, step=7560, lr=0.168750, batch loss=0.263992, epoch loss=1.373138 Batch=419, step=7620, lr=0.168500, batch loss=0.258258, epoch loss=1.631396 Batch=479, step=7680, lr=0.168250, batch loss=0.234736, epoch loss=1.866132 Batch=539, step=7740, lr=0.168000, batch loss=0.190784, epoch loss=2.056916 Batch=599, step=7800, lr=0.167750, batch loss=0.228893, epoch loss=2.285808 Batch=659, step=7860, lr=0.167500, batch loss=0.304467, epoch loss=2.590275 Batch=719, step=7920, lr=0.167250, batch loss=0.309882, epoch loss=2.900157 Batch=779, step=7980, lr=0.166750, batch loss=0.328717, epoch loss=3.228874 Batch=839, step=8040, lr=0.166750, batch loss=0.293141, epoch loss=3.522015 Batch=899, step=8100, lr=0.166500, batch loss=0.262954, epoch loss=3.784970 Batch=959, step=8160, lr=0.166250, batch loss=0.198037, epoch loss=3.983007 Batch=1019, step=8220, lr=0.166000, batch loss=0.325189, epoch loss=4.308196 Batch=1079, step=8280, lr=0.165750, batch loss=0.192343, epoch loss=4.500539 Batch=1139, step=8340, lr=0.165500, batch loss=0.217862, epoch loss=4.718401 Batch=1199, step=8400, lr=0.165250, batch loss=0.173217, epoch loss=4.891618 Epoch=6, step=8400, lr=0.165250, epoch loss=4.891618 Batch=59, step=8460, lr=0.165000, batch loss=0.211993, epoch loss=0.211993 Batch=119, step=8520, lr=0.164750, batch loss=0.171987, epoch loss=0.383980 Batch=179, step=8580, lr=0.164500, batch loss=0.187827, epoch loss=0.571808 Batch=239, step=8640, lr=0.164250, batch loss=0.280402, epoch loss=0.852209 Batch=299, step=8700, lr=0.163750, batch loss=0.197185, epoch loss=1.049394 Batch=359, step=8760, lr=0.163750, batch loss=0.254747, epoch loss=1.304141 Batch=419, step=8820, lr=0.163500, batch loss=0.247118, epoch loss=1.551259 Batch=479, step=8880, lr=0.163250, batch loss=0.227357, epoch loss=1.778616 Batch=539, step=8940, lr=0.163000, batch loss=0.180231, epoch loss=1.958847 Batch=599, step=9000, lr=0.162750, batch loss=0.216193, epoch loss=2.175040 Batch=659, step=9060, lr=0.162500, batch loss=0.294190, epoch loss=2.469229 Batch=719, step=9120, lr=0.162250, batch loss=0.296605, epoch loss=2.765834 Batch=779, step=9180, lr=0.162000, batch loss=0.315609, epoch loss=3.081443 Batch=839, step=9240, lr=0.161750, batch loss=0.281597, epoch loss=3.363040 Batch=899, step=9300, lr=0.161500, batch loss=0.253739, epoch loss=3.616779 Batch=959, step=9360, lr=0.161250, batch loss=0.197371, epoch loss=3.814150 Batch=1019, step=9420, lr=0.161000, batch loss=0.309563, epoch loss=4.123714 Batch=1079, step=9480, lr=0.160750, batch loss=0.194143, epoch loss=4.317856 Batch=1139, step=9540, lr=0.160500, batch loss=0.209494, epoch loss=4.527350 Batch=1199, step=9600, lr=0.160250, batch loss=0.167042, epoch loss=4.694392 Epoch=7, step=9600, lr=0.160250, epoch loss=4.694392 Batch=59, step=9660, lr=0.160000, batch loss=0.195193, epoch loss=0.195193 Batch=119, step=9720, lr=0.159750, batch loss=0.167784, epoch loss=0.362978 Batch=179, step=9780, lr=0.159500, batch loss=0.179814, epoch loss=0.542791 Batch=239, step=9840, lr=0.159250, batch loss=0.264381, epoch loss=0.807173 Batch=299, step=9900, lr=0.159000, batch loss=0.180861, epoch loss=0.988034 Batch=359, step=9960, lr=0.158750, batch loss=0.242197, epoch loss=1.230231 Batch=419, step=10020, lr=0.158500, batch loss=0.231669, epoch loss=1.461900 Batch=479, step=10080, lr=0.158250, batch loss=0.216210, epoch loss=1.678111 Batch=539, step=10140, lr=0.158000, batch loss=0.169399, epoch loss=1.847510 Batch=599, step=10200, lr=0.157500, batch 
loss=0.204662, epoch loss=2.052172 Batch=659, step=10260, lr=0.157500, batch loss=0.280648, epoch loss=2.332820 Batch=719, step=10320, lr=0.157250, batch loss=0.279963, epoch loss=2.612782 Batch=779, step=10380, lr=0.157000, batch loss=0.297437, epoch loss=2.910219 Batch=839, step=10440, lr=0.156750, batch loss=0.273160, epoch loss=3.183379 Batch=899, step=10500, lr=0.156500, batch loss=0.237787, epoch loss=3.421166 Batch=959, step=10560, lr=0.156250, batch loss=0.178025, epoch loss=3.599191 Batch=1019, step=10620, lr=0.156000, batch loss=0.284354, epoch loss=3.883545 Batch=1079, step=10680, lr=0.155750, batch loss=0.160825, epoch loss=4.044370 Batch=1139, step=10740, lr=0.155500, batch loss=0.187833, epoch loss=4.232203 Batch=1199, step=10800, lr=0.155250, batch loss=0.155104, epoch loss=4.387307 Epoch=8, step=10800, lr=0.155250, epoch loss=4.387307 Batch=59, step=10860, lr=0.155000, batch loss=0.179702, epoch loss=0.179702 Batch=119, step=10920, lr=0.154750, batch loss=0.150506, epoch loss=0.330208 Batch=179, step=10980, lr=0.154500, batch loss=0.166507, epoch loss=0.496715 Batch=239, step=11040, lr=0.154250, batch loss=0.242112, epoch loss=0.738827 Batch=299, step=11100, lr=0.154000, batch loss=0.172611, epoch loss=0.911437 Batch=359, step=11160, lr=0.153750, batch loss=0.221194, epoch loss=1.132631 Batch=419, step=11220, lr=0.153500, batch loss=0.217924, epoch loss=1.350556 Batch=479, step=11280, lr=0.153250, batch loss=0.211958, epoch loss=1.562513 Batch=539, step=11340, lr=0.153000, batch loss=0.164068, epoch loss=1.726581 Batch=599, step=11400, lr=0.152500, batch loss=0.178388, epoch loss=1.904969 Batch=659, step=11460, lr=0.152250, batch loss=0.265430, epoch loss=2.170399 Batch=719, step=11520, lr=0.152250, batch loss=0.263159, epoch loss=2.433558 Batch=779, step=11580, lr=0.152000, batch loss=0.273330, epoch loss=2.706889 Batch=839, step=11640, lr=0.151750, batch loss=0.251026, epoch loss=2.957914 Batch=899, step=11700, lr=0.151250, batch loss=0.220161, epoch loss=3.178075 Batch=959, step=11760, lr=0.151250, batch loss=0.183518, epoch loss=3.361593 Batch=1019, step=11820, lr=0.151000, batch loss=0.282868, epoch loss=3.644461 Batch=1079, step=11880, lr=0.150750, batch loss=0.154978, epoch loss=3.799438 Batch=1139, step=11940, lr=0.150500, batch loss=0.191845, epoch loss=3.991283 Batch=1199, step=12000, lr=0.150250, batch loss=0.139657, epoch loss=4.130940 Epoch=9, step=12000, lr=0.150250, epoch loss=4.130940 Batch=59, step=12060, lr=0.150000, batch loss=0.161169, epoch loss=0.161169 Batch=119, step=12120, lr=0.149750, batch loss=0.138190, epoch loss=0.299360 Batch=179, step=12180, lr=0.149500, batch loss=0.152009, epoch loss=0.451368 Batch=239, step=12240, lr=0.149000, batch loss=0.219858, epoch loss=0.671226 Batch=299, step=12300, lr=0.149000, batch loss=0.144693, epoch loss=0.815919 Batch=359, step=12360, lr=0.148750, batch loss=0.196343, epoch loss=1.012263 Batch=419, step=12420, lr=0.148500, batch loss=0.193425, epoch loss=1.205688 Batch=479, step=12480, lr=0.148250, batch loss=0.181788, epoch loss=1.387476 Batch=539, step=12540, lr=0.148000, batch loss=0.142093, epoch loss=1.529569 Batch=599, step=12600, lr=0.147750, batch loss=0.151644, epoch loss=1.681213 Batch=659, step=12660, lr=0.147500, batch loss=0.225609, epoch loss=1.906822 Batch=719, step=12720, lr=0.147250, batch loss=0.239347, epoch loss=2.146169 Batch=779, step=12780, lr=0.147000, batch loss=0.262896, epoch loss=2.409065 Batch=839, step=12840, lr=0.146500, batch loss=0.237784, epoch loss=2.646849 Batch=899, 
step=12900, lr=0.146500, batch loss=0.220983, epoch loss=2.867832 Batch=959, step=12960, lr=0.146250, batch loss=0.147859, epoch loss=3.015692 Batch=1019, step=13020, lr=0.146000, batch loss=0.243883, epoch loss=3.259575 Batch=1079, step=13080, lr=0.145750, batch loss=0.111114, epoch loss=3.370689 Batch=1139, step=13140, lr=0.145500, batch loss=0.148792, epoch loss=3.519481 Batch=1199, step=13200, lr=0.145250, batch loss=0.117617, epoch loss=3.637098 Epoch=10, step=13200, lr=0.145250, epoch loss=3.637098 Batch=59, step=13260, lr=0.145000, batch loss=0.141039, epoch loss=0.141039 Batch=119, step=13320, lr=0.144750, batch loss=0.121875, epoch loss=0.262914 Batch=179, step=13380, lr=0.144500, batch loss=0.129726, epoch loss=0.392640 Batch=239, step=13440, lr=0.144250, batch loss=0.195882, epoch loss=0.588522 Batch=299, step=13500, lr=0.144000, batch loss=0.127159, epoch loss=0.715681 Batch=359, step=13560, lr=0.143750, batch loss=0.162553, epoch loss=0.878234 Batch=419, step=13620, lr=0.143500, batch loss=0.162849, epoch loss=1.041084 Batch=479, step=13680, lr=0.143000, batch loss=0.148583, epoch loss=1.189667 Batch=539, step=13740, lr=0.143000, batch loss=0.120306, epoch loss=1.309973 Batch=599, step=13800, lr=0.142750, batch loss=0.121943, epoch loss=1.431916 Batch=659, step=13860, lr=0.142500, batch loss=0.177509, epoch loss=1.609425 Batch=719, step=13920, lr=0.142250, batch loss=0.178230, epoch loss=1.787655 Batch=779, step=13980, lr=0.142000, batch loss=0.200258, epoch loss=1.987914 Batch=839, step=14040, lr=0.141750, batch loss=0.187738, epoch loss=2.175651 Batch=899, step=14100, lr=0.141500, batch loss=0.160090, epoch loss=2.335742 Batch=959, step=14160, lr=0.141250, batch loss=0.135726, epoch loss=2.471468 Batch=1019, step=14220, lr=0.141000, batch loss=0.282499, epoch loss=2.753967 Batch=1079, step=14280, lr=0.140750, batch loss=0.091021, epoch loss=2.844988 Batch=1139, step=14340, lr=0.140500, batch loss=0.131211, epoch loss=2.976199 Batch=1199, step=14400, lr=0.140250, batch loss=0.093960, epoch loss=3.070159 Epoch=11, step=14400, lr=0.140250, epoch loss=3.070159 Batch=59, step=14460, lr=0.140000, batch loss=0.122163, epoch loss=0.122163 Batch=119, step=14520, lr=0.139750, batch loss=0.106239, epoch loss=0.228402 Batch=179, step=14580, lr=0.139500, batch loss=0.105144, epoch loss=0.333545 Batch=239, step=14640, lr=0.139250, batch loss=0.143634, epoch loss=0.477179 Batch=299, step=14700, lr=0.139000, batch loss=0.080518, epoch loss=0.557697 Batch=359, step=14760, lr=0.138750, batch loss=0.120577, epoch loss=0.678274 Batch=419, step=14820, lr=0.138500, batch loss=0.125366, epoch loss=0.803640 Batch=479, step=14880, lr=0.138000, batch loss=0.107597, epoch loss=0.911237 Batch=539, step=14940, lr=0.137750, batch loss=0.112318, epoch loss=1.023555 Batch=599, step=15000, lr=0.137750, batch loss=0.087471, epoch loss=1.111026 Batch=659, step=15060, lr=0.137500, batch loss=0.128926, epoch loss=1.239953 Batch=719, step=15120, lr=0.137000, batch loss=0.125467, epoch loss=1.365419 Batch=779, step=15180, lr=0.137000, batch loss=0.128702, epoch loss=1.494121 Batch=839, step=15240, lr=0.136750, batch loss=0.150958, epoch loss=1.645080 Batch=899, step=15300, lr=0.136500, batch loss=0.197714, epoch loss=1.842793 Batch=959, step=15360, lr=0.136250, batch loss=0.076869, epoch loss=1.919663 Batch=1019, step=15420, lr=0.136000, batch loss=0.162959, epoch loss=2.082622 Batch=1079, step=15480, lr=0.135750, batch loss=0.048442, epoch loss=2.131064 Batch=1139, step=15540, lr=0.135500, batch loss=0.112968, 
epoch loss=2.244032 Batch=1199, step=15600, lr=0.135250, batch loss=0.058905, epoch loss=2.302937 Epoch=12, step=15600, lr=0.135250, epoch loss=2.302937 Batch=59, step=15660, lr=0.135000, batch loss=0.090262, epoch loss=0.090262 Batch=119, step=15720, lr=0.134750, batch loss=0.155845, epoch loss=0.246107 Batch=179, step=15780, lr=0.134500, batch loss=0.107117, epoch loss=0.353224 Batch=239, step=15840, lr=0.134250, batch loss=0.102748, epoch loss=0.455972 Batch=299, step=15900, lr=0.134000, batch loss=0.050515, epoch loss=0.506488 Batch=359, step=15960, lr=0.133750, batch loss=0.102279, epoch loss=0.608767 Batch=419, step=16020, lr=0.133500, batch loss=0.078920, epoch loss=0.687687 Batch=479, step=16080, lr=0.133250, batch loss=0.059236, epoch loss=0.746924 Batch=539, step=16140, lr=0.133000, batch loss=0.061624, epoch loss=0.808548 Batch=599, step=16200, lr=0.132750, batch loss=0.133736, epoch loss=0.942284 Batch=659, step=16260, lr=0.132500, batch loss=0.088936, epoch loss=1.031220 Batch=719, step=16320, lr=0.132000, batch loss=0.129687, epoch loss=1.160907 Batch=779, step=16380, lr=0.132000, batch loss=0.290581, epoch loss=1.451488 Batch=839, step=16440, lr=0.131750, batch loss=0.092954, epoch loss=1.544442 Batch=899, step=16500, lr=0.131500, batch loss=0.078063, epoch loss=1.622505 Batch=959, step=16560, lr=0.131250, batch loss=0.034555, epoch loss=1.657061 Batch=1019, step=16620, lr=0.131000, batch loss=0.067412, epoch loss=1.724472 Batch=1079, step=16680, lr=0.130750, batch loss=0.050322, epoch loss=1.774795 Batch=1139, step=16740, lr=0.130250, batch loss=0.087538, epoch loss=1.862332 Batch=1199, step=16800, lr=0.130250, batch loss=0.040316, epoch loss=1.902648 Epoch=13, step=16800, lr=0.130250, epoch loss=1.902648 Batch=59, step=16860, lr=0.130000, batch loss=0.033512, epoch loss=0.033512 Batch=119, step=16920, lr=0.129750, batch loss=0.034192, epoch loss=0.067704 Batch=179, step=16980, lr=0.129500, batch loss=0.044395, epoch loss=0.112099 Batch=239, step=17040, lr=0.129250, batch loss=0.059676, epoch loss=0.171775 Batch=299, step=17100, lr=0.129000, batch loss=0.020703, epoch loss=0.192478 Batch=359, step=17160, lr=0.128750, batch loss=0.043783, epoch loss=0.236261 Batch=419, step=17220, lr=0.128500, batch loss=0.052370, epoch loss=0.288631 Batch=479, step=17280, lr=0.128250, batch loss=0.040823, epoch loss=0.329454 Batch=539, step=17340, lr=0.128000, batch loss=0.101985, epoch loss=0.431440 Batch=599, step=17400, lr=0.127750, batch loss=0.041520, epoch loss=0.472959 Batch=659, step=17460, lr=0.127500, batch loss=0.054379, epoch loss=0.527338 Batch=719, step=17520, lr=0.127250, batch loss=0.039514, epoch loss=0.566852 Batch=779, step=17580, lr=0.127000, batch loss=0.045182, epoch loss=0.612033 Batch=839, step=17640, lr=0.126750, batch loss=0.078695, epoch loss=0.690729 Batch=899, step=17700, lr=0.126500, batch loss=0.053143, epoch loss=0.743872 Batch=959, step=17760, lr=0.126250, batch loss=0.018587, epoch loss=0.762458 Batch=1019, step=17820, lr=0.126000, batch loss=0.025876, epoch loss=0.788334 Batch=1079, step=17880, lr=0.125750, batch loss=0.026034, epoch loss=0.814369 Batch=1139, step=17940, lr=0.125500, batch loss=0.074706, epoch loss=0.889074 Batch=1199, step=18000, lr=0.125250, batch loss=0.019694, epoch loss=0.908769 Epoch=14, step=18000, lr=0.125250, epoch loss=0.908769 Batch=59, step=18060, lr=0.125000, batch loss=0.015702, epoch loss=0.015702 Batch=119, step=18120, lr=0.124750, batch loss=0.022651, epoch loss=0.038354 Batch=179, step=18180, lr=0.124500, batch 
loss=0.030577, epoch loss=0.068931 Batch=239, step=18240, lr=0.124250, batch loss=0.036310, epoch loss=0.105241 Batch=299, step=18300, lr=0.124000, batch loss=0.009176, epoch loss=0.114417 Batch=359, step=18360, lr=0.123750, batch loss=0.023930, epoch loss=0.138348 Batch=419, step=18420, lr=0.123500, batch loss=0.029480, epoch loss=0.167827 Batch=479, step=18480, lr=0.123250, batch loss=0.021520, epoch loss=0.189347 Batch=539, step=18540, lr=0.123000, batch loss=0.041404, epoch loss=0.230751 Batch=599, step=18600, lr=0.122750, batch loss=0.026340, epoch loss=0.257091 Batch=659, step=18660, lr=0.122500, batch loss=0.033474, epoch loss=0.290565 Batch=719, step=18720, lr=0.122250, batch loss=0.038586, epoch loss=0.329151 Batch=779, step=18780, lr=0.122000, batch loss=0.130767, epoch loss=0.459917 Batch=839, step=18840, lr=0.121500, batch loss=0.053530, epoch loss=0.513447 Batch=899, step=18900, lr=0.121500, batch loss=0.051428, epoch loss=0.564875 Batch=959, step=18960, lr=0.121250, batch loss=0.016323, epoch loss=0.581198 Batch=1019, step=19020, lr=0.121000, batch loss=0.028107, epoch loss=0.609305 Batch=1079, step=19080, lr=0.120750, batch loss=0.011504, epoch loss=0.620809 Batch=1139, step=19140, lr=0.120500, batch loss=0.024262, epoch loss=0.645071 Batch=1199, step=19200, lr=0.120250, batch loss=0.009944, epoch loss=0.655015 Epoch=15, step=19200, lr=0.120250, epoch loss=0.655015 Batch=59, step=19260, lr=0.120000, batch loss=0.005038, epoch loss=0.005038 Batch=119, step=19320, lr=0.119750, batch loss=0.021789, epoch loss=0.026827 Batch=179, step=19380, lr=0.119500, batch loss=0.050848, epoch loss=0.077675 Batch=239, step=19440, lr=0.119250, batch loss=0.023090, epoch loss=0.100766 Batch=299, step=19500, lr=0.119000, batch loss=0.020290, epoch loss=0.121055 Batch=359, step=19560, lr=0.118750, batch loss=0.032711, epoch loss=0.153767 Batch=419, step=19620, lr=0.118500, batch loss=0.020411, epoch loss=0.174178 Batch=479, step=19680, lr=0.118250, batch loss=0.007260, epoch loss=0.181438 Batch=539, step=19740, lr=0.118000, batch loss=0.019249, epoch loss=0.200687 Batch=599, step=19800, lr=0.117750, batch loss=0.024256, epoch loss=0.224943 Batch=659, step=19860, lr=0.117500, batch loss=0.021747, epoch loss=0.246690 Batch=719, step=19920, lr=0.117250, batch loss=0.051449, epoch loss=0.298139 Batch=779, step=19980, lr=0.117000, batch loss=0.087284, epoch loss=0.385423 Batch=839, step=20040, lr=0.116750, batch loss=0.030719, epoch loss=0.416141 Batch=899, step=20100, lr=0.116500, batch loss=0.033755, epoch loss=0.449897 Batch=959, step=20160, lr=0.116250, batch loss=0.011117, epoch loss=0.461013 Batch=1019, step=20220, lr=0.116000, batch loss=0.014934, epoch loss=0.475948 Batch=1079, step=20280, lr=0.115750, batch loss=0.002846, epoch loss=0.478794 Batch=1139, step=20340, lr=0.115500, batch loss=0.015224, epoch loss=0.494018 Batch=1199, step=20400, lr=0.115250, batch loss=0.005626, epoch loss=0.499644 Epoch=16, step=20400, lr=0.115250, epoch loss=0.499644 Batch=59, step=20460, lr=0.115000, batch loss=0.003119, epoch loss=0.003119 Batch=119, step=20520, lr=0.114750, batch loss=0.009803, epoch loss=0.012922 Batch=179, step=20580, lr=0.114500, batch loss=0.023192, epoch loss=0.036113 Batch=239, step=20640, lr=0.114250, batch loss=0.015974, epoch loss=0.052087 Batch=299, step=20700, lr=0.114000, batch loss=0.004751, epoch loss=0.056838 Batch=359, step=20760, lr=0.113750, batch loss=0.013991, epoch loss=0.070829 Batch=419, step=20820, lr=0.113250, batch loss=0.014338, epoch loss=0.085167 Batch=479, 
Batch=539, step=20940, lr=0.113000, batch loss=0.015808, epoch loss=0.105870
Batch=599, step=21000, lr=0.112750, batch loss=0.018582, epoch loss=0.124451
Batch=659, step=21060, lr=0.112500, batch loss=0.015251, epoch loss=0.139702
Batch=719, step=21120, lr=0.112250, batch loss=0.043951, epoch loss=0.183653
Batch=779, step=21180, lr=0.112000, batch loss=0.072934, epoch loss=0.256586
Batch=839, step=21240, lr=0.111750, batch loss=0.026200, epoch loss=0.282786
Batch=899, step=21300, lr=0.111250, batch loss=0.028161, epoch loss=0.310947
Batch=959, step=21360, lr=0.111250, batch loss=0.009970, epoch loss=0.320917
Batch=1019, step=21420, lr=0.111000, batch loss=0.011867, epoch loss=0.332784
Batch=1079, step=21480, lr=0.110750, batch loss=0.000756, epoch loss=0.333540
Batch=1139, step=21540, lr=0.110500, batch loss=0.013183, epoch loss=0.346722
Batch=1199, step=21600, lr=0.110250, batch loss=0.005181, epoch loss=0.351903
Epoch=17, step=21600, lr=0.110250, epoch loss=0.351903
Batch=59, step=21660, lr=0.110000, batch loss=0.002624, epoch loss=0.002624
Batch=119, step=21720, lr=0.109750, batch loss=0.006573, epoch loss=0.009197
Batch=179, step=21780, lr=0.109500, batch loss=0.013056, epoch loss=0.022253
Batch=239, step=21840, lr=0.109250, batch loss=0.009833, epoch loss=0.032086
Batch=299, step=21900, lr=0.109000, batch loss=0.001449, epoch loss=0.033535
Batch=359, step=21960, lr=0.108750, batch loss=0.011554, epoch loss=0.045089
Batch=419, step=22020, lr=0.108500, batch loss=0.012653, epoch loss=0.057742
Batch=479, step=22080, lr=0.108250, batch loss=0.004604, epoch loss=0.062346
Batch=539, step=22140, lr=0.108000, batch loss=0.015456, epoch loss=0.077802
Batch=599, step=22200, lr=0.107750, batch loss=0.015898, epoch loss=0.093700
Batch=659, step=22260, lr=0.107500, batch loss=0.013527, epoch loss=0.107227
Batch=719, step=22320, lr=0.107250, batch loss=0.024273, epoch loss=0.131500
Batch=779, step=22380, lr=0.107000, batch loss=0.038351, epoch loss=0.169851
Batch=839, step=22440, lr=0.106750, batch loss=0.021582, epoch loss=0.191432
Batch=899, step=22500, lr=0.106500, batch loss=0.022645, epoch loss=0.214077
Batch=959, step=22560, lr=0.106250, batch loss=0.012211, epoch loss=0.226288
Batch=1019, step=22620, lr=0.106000, batch loss=0.011879, epoch loss=0.238167
Batch=1079, step=22680, lr=0.105750, batch loss=0.000940, epoch loss=0.239108
Batch=1139, step=22740, lr=0.105500, batch loss=0.011446, epoch loss=0.250553
Batch=1199, step=22800, lr=0.105250, batch loss=0.004530, epoch loss=0.255084
Epoch=18, step=22800, lr=0.105250, epoch loss=0.255084
Batch=59, step=22860, lr=0.105000, batch loss=0.001546, epoch loss=0.001546
Batch=119, step=22920, lr=0.104750, batch loss=0.005785, epoch loss=0.007331
Batch=179, step=22980, lr=0.104500, batch loss=0.011325, epoch loss=0.018655
Batch=239, step=23040, lr=0.104250, batch loss=0.009345, epoch loss=0.028000
Batch=299, step=23100, lr=0.104000, batch loss=0.008394, epoch loss=0.036394
Batch=359, step=23160, lr=0.103750, batch loss=0.011237, epoch loss=0.047631
Batch=419, step=23220, lr=0.103500, batch loss=0.011193, epoch loss=0.058824
Batch=479, step=23280, lr=0.103250, batch loss=0.004162, epoch loss=0.062986
Batch=539, step=23340, lr=0.103000, batch loss=0.014650, epoch loss=0.077636
Batch=599, step=23400, lr=0.102500, batch loss=0.015010, epoch loss=0.092646
Batch=659, step=23460, lr=0.102500, batch loss=0.014662, epoch loss=0.107308
Batch=719, step=23520, lr=0.102250, batch loss=0.017661, epoch loss=0.124970
Batch=779, step=23580, lr=0.102000, batch loss=0.020065, epoch loss=0.145035
Batch=839, step=23640, lr=0.101750, batch loss=0.025121, epoch loss=0.170156
Batch=899, step=23700, lr=0.101500, batch loss=0.023862, epoch loss=0.194017
Batch=959, step=23760, lr=0.101250, batch loss=0.008003, epoch loss=0.202021
Batch=1019, step=23820, lr=0.100750, batch loss=0.008304, epoch loss=0.210324
Batch=1079, step=23880, lr=0.100750, batch loss=0.001001, epoch loss=0.211325
Batch=1139, step=23940, lr=0.100500, batch loss=0.009557, epoch loss=0.220882
Batch=1199, step=24000, lr=0.100250, batch loss=0.004659, epoch loss=0.225542
Epoch=19, step=24000, lr=0.100250, epoch loss=0.225542
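For orientation, the records above come from a training loop that accumulates each batch loss into a running epoch loss and decays the learning rate roughly linearly (about 0.00025 every 60 steps, as read off the printed values). The OCaml sketch below only illustrates that logging pattern; train_step, lr_of_step and all constants are assumptions for illustration, not OCANNL's actual API.

  let () =
    let batches_per_epoch = 1200 and log_every = 60 in
    (* lr schedule inferred from the printed values: about 0.00025 less every 60 steps *)
    let lr_of_step step = 0.20025 -. (float_of_int step /. 60. *. 0.00025) in
    (* hypothetical stand-in for the real forward/backward/update; returns the batch loss *)
    let train_step ~lr:_ _batch = Random.float 0.1 in
    let step = ref 0 in
    for epoch = 0 to 19 do
      let epoch_loss = ref 0.0 in
      for batch = 0 to batches_per_epoch - 1 do
        incr step;
        let lr = lr_of_step !step in
        let batch_loss = train_step ~lr batch in
        (* the "epoch loss" column is simply the running sum of batch losses *)
        epoch_loss := !epoch_loss +. batch_loss;
        if (batch + 1) mod log_every = 0 then
          Printf.printf "Batch=%d, step=%d, lr=%f, batch loss=%f, epoch loss=%f\n"
            batch !step lr batch_loss !epoch_loss
      done;
      Printf.printf "Epoch=%d, step=%d, lr=%f, epoch loss=%f\n"
        epoch !step (lr_of_step !step) !epoch_loss
    done

With a real train_step, such a loop would emit records in exactly the format seen above: one line every 60 batches and a summary line at the end of each epoch.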
Half-moons scatterplot and decision boundary:
[text plot, 40 rows × 102 columns: the two interleaved half-moon point clusters (drawn with '#' and '%') overlaid on the learned decision regions (drawn with '*' and '.')]
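The half-moons data behind this plot is a standard synthetic benchmark: two noisy, interleaved half circles, one per class. A minimal OCaml sketch of a common generator follows; the noise level, offsets and the Box-Muller helper are assumptions, and the generator used by the ocannl example may differ.

  (* Illustrative only: synthesize two interleaved "half moons", one per class. *)
  let half_moons ~n ~noise =
    (* Box-Muller transform: approximately standard-normal noise *)
    let gauss () =
      let u1 = 1.0 -. Random.float 1.0 and u2 = Random.float 1.0 in
      sqrt (-2.0 *. log u1) *. cos (2.0 *. Float.pi *. u2)
    in
    List.init n (fun i ->
        let t = Float.pi *. float_of_int i /. float_of_int (max 1 (n - 1)) in
        if i mod 2 = 0 then
          (* upper moon -> class 0 *)
          (cos t +. noise *. gauss (), sin t +. noise *. gauss (), 0)
        else
          (* lower moon, shifted so the two moons interleave -> class 1 *)
          (1.0 -. cos t +. noise *. gauss (), 0.5 -. sin t +. noise *. gauss (), 1))

A call such as half_moons ~n:6000 ~noise:0.1 yields labelled (x, y, class) points comparable to the '#' and '%' clusters plotted above.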
"/usr/bin/env" "bash" "-c" "opam exec -- dune build @install @check @runtest && rm -rf _build" failed with exit status 1
2025-05-15 13:55.34: Job failed: Failed: Build failed