2025-03-20 22:11.56: New job: test ahrefs/ocannl https://github.com/ahrefs/ocannl.git#refs/heads/master (4ee46a20839684c520fd8d1cc91b4a5416d1e783) (linux-x86_64:debian-12-5.3+flambda_opam-2.3)
Base: ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96
Opam project build

To reproduce locally:

git clone --recursive "https://github.com/ahrefs/ocannl.git" -b "master" && cd "ocannl" && git reset --hard 4ee46a20
cat > Dockerfile <<'END-OF-DOCKERFILE'
FROM ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96
# debian-12-5.3+flambda_opam-2.3
USER 1000:1000
ENV CLICOLOR_FORCE="1"
ENV OPAMCOLOR="always"
WORKDIR /src
RUN sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam
RUN opam init --reinit -ni
RUN uname -rs && opam exec -- ocaml -version && opam --version
WORKDIR /src
RUN sudo chown opam /src
RUN cd ~/opam-repository && (git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master) && git reset -q --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 && git log --no-decorate -n1 --oneline && opam update -u
COPY --chown=1000:1000 neural_nets_lib.opam arrayjit.opam ./
RUN opam pin add -yn neural_nets_lib.dev './' && \
opam pin add -yn arrayjit.dev './'
RUN echo '(lang dune 3.0)' > './dune-project'
ENV DEPS="angstrom.0.16.1 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.17.2 dune-configurator.3.17.2 fieldslib.v0.17.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-syntax-shims.1.0.0 ocaml-variants.5.3.0+options ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.1.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0"
ENV CI="true"
ENV OCAMLCI="true"
RUN opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS
RUN opam install $DEPS
COPY --chown=1000:1000 . /src
RUN opam exec -- dune build @install @check @runtest && rm -rf _build

END-OF-DOCKERFILE
docker build .
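# Optional follow-up (not part of the CI job output; a sketch only): tag the image
# so the test step can be re-run interactively after the build completes. The tag
# name "ocannl-repro" is an arbitrary choice, not something the CI uses.
#   docker build -t ocannl-repro .
#   docker run --rm -it ocannl-repro opam exec -- dune build @runtest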
END-REPRO-BLOCK 2025-03-20 22:11.56: Using cache hint "ahrefs/ocannl-ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96-debian-12-5.3+flambda_opam-2.3-14a85f4c565cc30186c137b219fc7fa2" 2025-03-20 22:11.56: Using OBuilder spec: ((from ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96) (comment debian-12-5.3+flambda_opam-2.3) (user (uid 1000) (gid 1000)) (env CLICOLOR_FORCE 1) (env OPAMCOLOR always) (workdir /src) (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam")) (run (shell "opam init --reinit -ni")) (run (shell "uname -rs && opam exec -- ocaml -version && opam --version")) (workdir /src) (run (shell "sudo chown opam /src")) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master) && git reset -q --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 && git log --no-decorate -n1 --oneline && opam update -u")) (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./)) (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'")) (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'")) (env DEPS "angstrom.0.16.1 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.17.2 dune-configurator.3.17.2 fieldslib.v0.17.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-syntax-shims.1.0.0 ocaml-variants.5.3.0+options ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.1.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0") (env CI true) (env OCAMLCI true) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS")) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS")) (copy (src .) 
(dst /src)) (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build")) ) 2025-03-20 22:11.56: Waiting for resource in pool OCluster 2025-03-20 22:11.56: Waiting for worker… 2025-03-20 22:15.24: Got resource from pool OCluster Building on asteria.caelum.ci.dev HEAD is now at ccaf459c Missing from previous commit: test update HEAD is now at 4ee46a20 Update GitHub actions (from ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96) 2025-03-20 22:15.25 ---> using "0cceac30ed9dfa8d54c8dfb703526aecc2d1f25e09755ea19f6a9b3ce08944d1" from cache /: (comment debian-12-5.3+flambda_opam-2.3) /: (user (uid 1000) (gid 1000)) /: (env CLICOLOR_FORCE 1) /: (env OPAMCOLOR always) /: (workdir /src) /src: (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam")) 2025-03-20 22:15.25 ---> using "b6cc72d0b69338afee388438c163da01e1509537d9db800516f8d6f84e0ff0f0" from cache /src: (run (shell "opam init --reinit -ni")) Configuring from /home/opam/.opamrc and then from built-in defaults. Checking for available remotes: rsync and local, git. - you won't be able to use mercurial repositories unless you install the hg command on your system. - you won't be able to use darcs repositories unless you install the darcs command on your system. This development version of opam requires an update to the layout of /home/opam/.opam from version 2.0 to version 2.2, which can't be reverted. You may want to back it up before going further. Continue? [y/n] y [NOTE] The 'jobs' option was reset, its value was 39 and its new value will vary according to the current number of cores on your machine. You can restore the fixed value using: opam option jobs=39 --global Format upgrade done. <><> Updating repositories ><><><><><><><><><><><><><><><><><><><><><><><><><><> [ERROR] Could not update repository "opam-repository-archive": "/usr/bin/git fetch -q" exited with code 128 "fatal: unable to access 'https://github.com/ocaml/opam-repository-archive/': Could not resolve host: github.com" [default] synchronised from file:///home/opam/opam-repository 2025-03-20 22:15.25 ---> using "9579671be2547253d961834dbf99a2617c3043d50341e16980a14c2b7946d157" from cache /src: (run (shell "uname -rs && opam exec -- ocaml -version && opam --version")) Linux 5.15.0-134-generic The OCaml toplevel, version 5.3.0 2.3.0 2025-03-20 22:15.25 ---> using "21f2427316ecd3b2b1d06245ccc85b94bbe86b75b44d925374a8e2c678f4916d" from cache /src: (workdir /src) /src: (run (shell "sudo chown opam /src")) 2025-03-20 22:15.25 ---> using "a9c2b183ae9ffb50c46a6bddc4bdbd4bc2a49a6d4fa95d31a0495ab1d51f7cf3" from cache /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master) && git reset -q --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 && git log --no-decorate -n1 --oneline && opam update -u")) From https://github.com/ocaml/opam-repository * branch master -> FETCH_HEAD 862a7640b1..6cf83229dd master -> origin/master 4e25d0cf5f Merge pull request #27651 from lukstafi/opam-publish-ppx_minidebug.2.1.0 <><> Updating package repositories ><><><><><><><><><><><><><><><><><><><><><><> [opam-repository-archive] synchronised from git+https://github.com/ocaml/opam-repository-archive [default] synchronised from file:///home/opam/opam-repository Everything as up-to-date as possible (run with --verbose to show unavailable upgrades). 
However, you may "opam upgrade" these packages explicitly, which will ask permission to downgrade or uninstall the conflicting packages. Nothing to do. # To update the current shell environment, run: eval $(opam env) 2025-03-20 22:15.25 ---> using "af53f33f5b819debc22b733209c9b4785d8a42ec3a79f8a616d660e6188b4b5a" from cache /src: (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./)) 2025-03-20 22:15.25 ---> using "4d092fe60106e4cfa0fe459c0b169083e09f5dc64b9b316296acefc369daf27c" from cache /src: (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'")) [neural_nets_lib.dev] synchronised (file:///src) neural_nets_lib is now pinned to file:///src (version dev) [arrayjit.dev] synchronised (file:///src) arrayjit is now pinned to file:///src (version dev) 2025-03-20 22:15.25 ---> using "ed41bc75e2dba58dad74574c9ffea1421b3c12ed3ca43c7e83f3886b17c40c33" from cache /src: (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'")) 2025-03-20 22:15.25 ---> using "e17b79b6b616db5a76c5f8ba7b9f742d2848885854a4f2f4f81546ed5d06406a" from cache /src: (env DEPS "angstrom.0.16.1 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.17.2 dune-configurator.3.17.2 fieldslib.v0.17.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-syntax-shims.1.0.0 ocaml-variants.5.3.0+options ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.1.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0") /src: (env CI true) /src: (env OCAMLCI true) /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS")) + /usr/bin/sudo "apt-get" "update" - Get:1 http://deb.debian.org/debian bookworm InRelease [151 kB] - Get:2 http://deb.debian.org/debian bookworm-updates InRelease [55.4 kB] - Get:3 http://deb.debian.org/debian-security bookworm-security InRelease [48.0 kB] - Get:4 http://deb.debian.org/debian bookworm/main amd64 Packages [8792 kB] - Get:5 http://deb.debian.org/debian-security bookworm-security/main amd64 Packages [249 kB] - Fetched 9296 kB in 1s (7159 kB/s) - Reading package lists... <><> Synchronising pinned packages ><><><><><><><><><><><><><><><><><><><><><><> [arrayjit.dev] synchronised (file:///src) [neural_nets_lib.dev] synchronised (file:///src) [NOTE] Package ocaml-variants is already installed (current version is 5.3.0+options). 
[NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). [NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). [NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). [NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). The following system packages will first need to be installed: libffi-dev pkg-config <><> Handling external dependencies <><><><><><><><><><><><><><><><><><><><><><> + /usr/bin/sudo "apt-get" "install" "-qq" "-yy" "libffi-dev" "pkg-config" - debconf: delaying package configuration, since apt-utils is not installed - Selecting previously unselected package libffi-dev:amd64. - (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 18776 files and directories currently installed.) - Preparing to unpack .../libffi-dev_3.4.4-1_amd64.deb ... - Unpacking libffi-dev:amd64 (3.4.4-1) ... - Selecting previously unselected package libpkgconf3:amd64. - Preparing to unpack .../libpkgconf3_1.8.1-1_amd64.deb ... - Unpacking libpkgconf3:amd64 (1.8.1-1) ... - Selecting previously unselected package pkgconf-bin. - Preparing to unpack .../pkgconf-bin_1.8.1-1_amd64.deb ... - Unpacking pkgconf-bin (1.8.1-1) ... - Selecting previously unselected package pkgconf:amd64. - Preparing to unpack .../pkgconf_1.8.1-1_amd64.deb ... - Unpacking pkgconf:amd64 (1.8.1-1) ... - Selecting previously unselected package pkg-config:amd64. - Preparing to unpack .../pkg-config_1.8.1-1_amd64.deb ... - Unpacking pkg-config:amd64 (1.8.1-1) ... - Setting up libffi-dev:amd64 (3.4.4-1) ... - Setting up libpkgconf3:amd64 (1.8.1-1) ... - Setting up pkgconf-bin (1.8.1-1) ... - Setting up pkgconf:amd64 (1.8.1-1) ... - Setting up pkg-config:amd64 (1.8.1-1) ... - Processing triggers for libc-bin (2.36-9+deb12u9) ... 2025-03-20 22:15.25 ---> using "b2525f254fe76ee0fff7852a88218c32446de8a68e191c41be20656ee0652909" from cache /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS")) [NOTE] Package ocaml-variants is already installed (current version is 5.3.0+options). [NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). [NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). [NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). 
[NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). The following actions will be performed: === install 65 packages - install angstrom 0.16.1 - install backoff 0.1.1 - install base v0.17.1 - install bigarray-compat 1.1.0 - install bigstringaf 0.10.0 - install conf-libffi 2.0.0 - install conf-pkg-config 4 - install cppo 1.8.0 - install csexp 1.5.2 - install ctypes 0.23.0 - install ctypes-foreign 0.23.0 - install dune 3.17.2 - install dune-configurator 3.17.2 - install fieldslib v0.17.0 - install integers 0.7.0 - install jane-street-headers v0.17.0 - install jst-config v0.17.0 - install mtime 2.1.0 - install multicore-magic 2.3.1 - install num 1.5-1 - install ocaml-compiler-libs v0.17.0 - install ocaml-syntax-shims 1.0.0 - install ocaml_intrinsics_kernel v0.17.1 - install ocamlbuild 0.16.1 - install ocamlfind 1.9.8 - install parsexp v0.17.0 - install ppx_assert v0.17.0 - install ppx_base v0.17.0 - install ppx_cold v0.17.0 - install ppx_compare v0.17.0 - install ppx_derivers 1.2.1 - install ppx_deriving 6.0.3 - install ppx_enumerate v0.17.0 - install ppx_expect v0.17.2 - install ppx_fields_conv v0.17.0 - install ppx_globalize v0.17.0 - install ppx_hash v0.17.0 - install ppx_here v0.17.0 - install ppx_inline_test v0.17.0 - install ppx_minidebug 2.1.0 - install ppx_optcomp v0.17.0 - install ppx_sexp_conv v0.17.0 - install ppx_string v0.17.0 - install ppx_variants_conv v0.17.0 - install ppxlib 0.35.0 - install ppxlib_jane v0.17.2 - install printbox 0.12 - install printbox-ext-plot 0.12 - install printbox-html 0.12 - install printbox-md 0.12 - install printbox-text 0.12 - install ptime 1.2.0 - install re 1.12.0 - install saturn_lockfree 0.5.0 - install seq base - install sexplib v0.17.0 - install sexplib0 v0.17.0 - install stdio v0.17.0 - install stdlib-shims 0.3.0 - install time_now v0.17.0 - install topkg 1.0.8 - install tyxml 4.6.0 - install uucp 16.0.0 - install uutf 1.0.4 - install variantslib v0.17.0 <><> Processing actions <><><><><><><><><><><><><><><><><><><><><><><><><><><><> -> retrieved backoff.0.1.1 (cached) -> retrieved bigarray-compat.1.1.0 (cached) -> retrieved bigstringaf.0.10.0 (cached) -> retrieved angstrom.0.16.1 (cached) -> retrieved base.v0.17.1 (cached) -> retrieved cppo.1.8.0 (cached) -> installed conf-pkg-config.4 -> retrieved csexp.1.5.2 (cached) -> retrieved ctypes.0.23.0, ctypes-foreign.0.23.0 (cached) -> installed conf-libffi.2.0.0 -> retrieved fieldslib.v0.17.0 (cached) -> retrieved integers.0.7.0 (cached) -> retrieved jane-street-headers.v0.17.0 (cached) -> retrieved jst-config.v0.17.0 (cached) -> retrieved mtime.2.1.0 (cached) -> retrieved multicore-magic.2.3.1 (cached) -> retrieved num.1.5-1 (cached) -> retrieved ocaml-compiler-libs.v0.17.0 (cached) -> retrieved ocaml-syntax-shims.1.0.0 (cached) -> retrieved ocaml_intrinsics_kernel.v0.17.1 (cached) -> retrieved ocamlbuild.0.16.1 (cached) -> retrieved ocamlfind.1.9.8 (cached) -> retrieved parsexp.v0.17.0 (cached) -> retrieved ppx_assert.v0.17.0 (cached) -> retrieved ppx_base.v0.17.0 (cached) -> retrieved ppx_cold.v0.17.0 (cached) -> retrieved ppx_compare.v0.17.0 (cached) -> retrieved ppx_derivers.1.2.1 (cached) -> retrieved ppx_enumerate.v0.17.0 (cached) -> retrieved dune.3.17.2, dune-configurator.3.17.2 (cached) -> installed num.1.5-1 -> retrieved ppx_deriving.6.0.3 (cached) -> retrieved ppx_expect.v0.17.2 (cached) -> retrieved ppx_fields_conv.v0.17.0 (cached) -> retrieved ppx_globalize.v0.17.0 (cached) -> 
retrieved ppx_hash.v0.17.0 (cached) -> retrieved ppx_here.v0.17.0 (cached) -> retrieved ppx_inline_test.v0.17.0 (cached) -> retrieved ppx_optcomp.v0.17.0 (cached) -> retrieved ppx_sexp_conv.v0.17.0 (cached) -> retrieved ppx_string.v0.17.0 (cached) -> retrieved ppx_variants_conv.v0.17.0 (cached) -> retrieved ppx_minidebug.2.1.0 (cached) -> retrieved ppxlib_jane.v0.17.2 (cached) -> retrieved ptime.1.2.0 (cached) -> retrieved re.1.12.0 (cached) -> retrieved ppxlib.0.35.0 (cached) -> retrieved seq.base (cached) -> installed seq.base -> retrieved printbox.0.12, printbox-ext-plot.0.12, printbox-html.0.12, printbox-md.0.12, printbox-text.0.12 (cached) -> retrieved saturn_lockfree.0.5.0 (cached) -> retrieved sexplib.v0.17.0 (cached) -> retrieved sexplib0.v0.17.0 (cached) -> retrieved stdio.v0.17.0 (cached) -> retrieved stdlib-shims.0.3.0 (cached) -> retrieved time_now.v0.17.0 (cached) -> retrieved topkg.1.0.8 (cached) -> retrieved tyxml.4.6.0 (cached) -> retrieved uutf.1.0.4 (cached) -> retrieved variantslib.v0.17.0 (cached) -> retrieved uucp.16.0.0 (cached) -> installed ocamlfind.1.9.8 -> installed ocamlbuild.0.16.1 -> installed topkg.1.0.8 -> installed uutf.1.0.4 -> installed mtime.2.1.0 -> installed ptime.1.2.0 -> installed dune.3.17.2 -> installed jane-street-headers.v0.17.0 -> installed ppx_derivers.1.2.1 -> installed backoff.0.1.1 -> installed printbox.0.12 -> installed bigarray-compat.1.1.0 -> installed csexp.1.5.2 -> installed multicore-magic.2.3.1 -> installed ocaml-syntax-shims.1.0.0 -> installed cppo.1.8.0 -> installed ocaml-compiler-libs.v0.17.0 -> installed ocaml_intrinsics_kernel.v0.17.1 -> installed re.1.12.0 -> installed sexplib0.v0.17.0 -> installed stdlib-shims.0.3.0 -> installed saturn_lockfree.0.5.0 -> installed integers.0.7.0 -> installed parsexp.v0.17.0 -> installed dune-configurator.3.17.2 -> installed bigstringaf.0.10.0 -> installed sexplib.v0.17.0 -> installed angstrom.0.16.1 -> installed tyxml.4.6.0 -> installed uucp.16.0.0 -> installed printbox-html.0.12 -> installed printbox-text.0.12 -> installed printbox-md.0.12 -> installed printbox-ext-plot.0.12 -> installed ctypes.0.23.0 -> installed base.v0.17.1 -> installed ctypes-foreign.0.23.0 -> installed variantslib.v0.17.0 -> installed fieldslib.v0.17.0 -> installed stdio.v0.17.0 -> installed ppxlib.0.35.0 -> installed ppx_optcomp.v0.17.0 -> installed ppxlib_jane.v0.17.2 -> installed ppx_cold.v0.17.0 -> installed ppx_here.v0.17.0 -> installed ppx_variants_conv.v0.17.0 -> installed ppx_fields_conv.v0.17.0 -> installed ppx_enumerate.v0.17.0 -> installed ppx_globalize.v0.17.0 -> installed ppx_deriving.6.0.3 -> installed ppx_compare.v0.17.0 -> installed ppx_sexp_conv.v0.17.0 -> installed ppx_hash.v0.17.0 -> installed ppx_assert.v0.17.0 -> installed ppx_minidebug.2.1.0 -> installed ppx_base.v0.17.0 -> installed jst-config.v0.17.0 -> installed ppx_string.v0.17.0 -> installed time_now.v0.17.0 -> installed ppx_inline_test.v0.17.0 -> installed ppx_expect.v0.17.2 Done. # To update the current shell environment, run: eval $(opam env) 2025-03-20 22:15.25 ---> using "fe9fe964598e38313b9199dad45340f8d22606e7deb7f9769e2b671f80f8063a" from cache /src: (copy (src .) (dst /src)) 2025-03-20 22:15.25 ---> saved as "d2c479c3673829352eeccef88b484f654b5bdf88a5aa064b28cc1073b8019d73" /src: (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build")) (cd _build/default/test_ppx && ./test_ppx_op.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config. 
Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/default/test_ppx && ./test_ppx_op_expected.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition '' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition 'Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/8b77c20becd28cf1da8fc920632dba9c/default/test/ocannl_config.' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition 'Retrieving commandline, environment, or config file variable ocannl_log_level' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition 'Found 0, in the config file' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition primitive_ops.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition zero2hero_1of7.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition hello_world_op.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! 
Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition einsum_trivia.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition micrograd_demo.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition moons_demo_parallel.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/ca33e61146172de7418e67b74bbdff30/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/default/test && ./moons_demo_parallel_run.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file ("Set log_level to" 1) └─{orphaned from #2} Retrieving commandline, environment, or config file variable ocannl_backend Found cc, in the config file Retrieving commandline, environment, or config file variable ocannl_ll_ident_style Not found, using default heuristic Retrieving commandline, environment, or config file variable ocannl_cc_backend_optimization_level Not found, using default 3 Retrieving commandline, environment, or config file variable ocannl_cc_backend_compiler_command Not found, using default gcc Retrieving commandline, environment, or config file variable ocannl_never_capture_stdout Not found, using default false Batch=59, step=60, lr=0.200000, batch loss=23.609453, epoch loss=23.609453 Batch=119, step=120, lr=0.199750, batch loss=8.539634, epoch loss=32.149087 Batch=179, step=180, lr=0.199500, batch loss=2.626295, epoch loss=34.775382 Batch=239, step=240, lr=0.199250, batch loss=0.849657, epoch loss=35.625039 Batch=299, step=300, lr=0.199000, batch loss=1.447177, epoch loss=37.072216 Batch=359, step=360, lr=0.198750, batch loss=1.329296, epoch loss=38.401512 Batch=419, step=420, lr=0.198500, batch loss=0.618569, epoch loss=39.020081 Batch=479, step=480, lr=0.198250, batch loss=0.822060, epoch loss=39.842141 Batch=539, step=540, lr=0.198000, batch loss=0.690244, epoch loss=40.532385 Batch=599, step=600, lr=0.197750, batch loss=1.063878, epoch loss=41.596263 Batch=659, step=660, lr=0.197500, batch loss=0.483340, epoch loss=42.079603 Batch=719, step=720, lr=0.197250, batch loss=0.411299, epoch loss=42.490902 Batch=779, step=780, lr=0.197000, batch loss=0.470123, epoch loss=42.961024 Batch=839, step=840, lr=0.196750, batch loss=0.446661, epoch 
loss=43.407685 Batch=899, step=900, lr=0.196500, batch loss=0.382721, epoch loss=43.790407 Batch=959, step=960, lr=0.196250, batch loss=0.245136, epoch loss=44.035543 Batch=1019, step=1020, lr=0.196000, batch loss=0.466506, epoch loss=44.502049 Batch=1079, step=1080, lr=0.195750, batch loss=0.248781, epoch loss=44.750829 Batch=1139, step=1140, lr=0.195500, batch loss=0.317440, epoch loss=45.068269 Batch=1199, step=1200, lr=0.195250, batch loss=0.263683, epoch loss=45.331952 Epoch=0, step=1200, lr=0.195250, epoch loss=45.331952 Batch=59, step=1260, lr=0.195000, batch loss=0.262138, epoch loss=0.262138 Batch=119, step=1320, lr=0.194750, batch loss=0.205243, epoch loss=0.467381 Batch=179, step=1380, lr=0.194500, batch loss=0.243644, epoch loss=0.711025 Batch=239, step=1440, lr=0.194250, batch loss=0.347897, epoch loss=1.058921 Batch=299, step=1500, lr=0.194000, batch loss=0.247348, epoch loss=1.306269 Batch=359, step=1560, lr=0.193500, batch loss=0.316559, epoch loss=1.622828 Batch=419, step=1620, lr=0.193500, batch loss=0.312705, epoch loss=1.935533 Batch=479, step=1680, lr=0.193250, batch loss=0.276293, epoch loss=2.211827 Batch=539, step=1740, lr=0.193000, batch loss=0.209805, epoch loss=2.421632 Batch=599, step=1800, lr=0.192750, batch loss=0.250440, epoch loss=2.672071 Batch=659, step=1860, lr=0.192500, batch loss=0.367161, epoch loss=3.039232 Batch=719, step=1920, lr=0.192250, batch loss=0.354963, epoch loss=3.394195 Batch=779, step=1980, lr=0.192000, batch loss=0.381346, epoch loss=3.775542 Batch=839, step=2040, lr=0.191750, batch loss=0.339645, epoch loss=4.115187 Batch=899, step=2100, lr=0.191500, batch loss=0.295230, epoch loss=4.410417 Batch=959, step=2160, lr=0.191250, batch loss=0.214068, epoch loss=4.624485 Batch=1019, step=2220, lr=0.191000, batch loss=0.330895, epoch loss=4.955380 Batch=1079, step=2280, lr=0.190750, batch loss=0.208336, epoch loss=5.163715 Batch=1139, step=2340, lr=0.190500, batch loss=0.278287, epoch loss=5.442003 Batch=1199, step=2400, lr=0.190250, batch loss=0.220827, epoch loss=5.662829 Epoch=1, step=2400, lr=0.190250, epoch loss=5.662829 Batch=59, step=2460, lr=0.190000, batch loss=0.230401, epoch loss=0.230401 Batch=119, step=2520, lr=0.189750, batch loss=0.195923, epoch loss=0.426324 Batch=179, step=2580, lr=0.189500, batch loss=0.221171, epoch loss=0.647496 Batch=239, step=2640, lr=0.189250, batch loss=0.328096, epoch loss=0.975592 Batch=299, step=2700, lr=0.189000, batch loss=0.202920, epoch loss=1.178511 Batch=359, step=2760, lr=0.188750, batch loss=0.288277, epoch loss=1.466789 Batch=419, step=2820, lr=0.188500, batch loss=0.280169, epoch loss=1.746958 Batch=479, step=2880, lr=0.188250, batch loss=0.251462, epoch loss=1.998420 Batch=539, step=2940, lr=0.188000, batch loss=0.191274, epoch loss=2.189693 Batch=599, step=3000, lr=0.187750, batch loss=0.224972, epoch loss=2.414665 Batch=659, step=3060, lr=0.187500, batch loss=0.335764, epoch loss=2.750430 Batch=719, step=3120, lr=0.187250, batch loss=0.331419, epoch loss=3.081849 Batch=779, step=3180, lr=0.187000, batch loss=0.357915, epoch loss=3.439763 Batch=839, step=3240, lr=0.186750, batch loss=0.325619, epoch loss=3.765382 Batch=899, step=3300, lr=0.186500, batch loss=0.292356, epoch loss=4.057738 Batch=959, step=3360, lr=0.186000, batch loss=0.244976, epoch loss=4.302713 Batch=1019, step=3420, lr=0.186000, batch loss=0.361592, epoch loss=4.664305 Batch=1079, step=3480, lr=0.185750, batch loss=0.222362, epoch loss=4.886668 Batch=1139, step=3540, lr=0.185500, batch loss=0.263846, epoch loss=5.150514 
Batch=1199, step=3600, lr=0.185250, batch loss=0.199983, epoch loss=5.350497 Epoch=2, step=3600, lr=0.185250, epoch loss=5.350497 Batch=59, step=3660, lr=0.185000, batch loss=0.221305, epoch loss=0.221305 Batch=119, step=3720, lr=0.184750, batch loss=0.184891, epoch loss=0.406196 Batch=179, step=3780, lr=0.184500, batch loss=0.211367, epoch loss=0.617563 Batch=239, step=3840, lr=0.184250, batch loss=0.317033, epoch loss=0.934596 Batch=299, step=3900, lr=0.184000, batch loss=0.209046, epoch loss=1.143642 Batch=359, step=3960, lr=0.183750, batch loss=0.285781, epoch loss=1.429423 Batch=419, step=4020, lr=0.183500, batch loss=0.278128, epoch loss=1.707551 Batch=479, step=4080, lr=0.183250, batch loss=0.255095, epoch loss=1.962646 Batch=539, step=4140, lr=0.183000, batch loss=0.199782, epoch loss=2.162428 Batch=599, step=4200, lr=0.182750, batch loss=0.242813, epoch loss=2.405241 Batch=659, step=4260, lr=0.182500, batch loss=0.327659, epoch loss=2.732901 Batch=719, step=4320, lr=0.182250, batch loss=0.330986, epoch loss=3.063887 Batch=779, step=4380, lr=0.182000, batch loss=0.349608, epoch loss=3.413495 Batch=839, step=4440, lr=0.181750, batch loss=0.317971, epoch loss=3.731466 Batch=899, step=4500, lr=0.181500, batch loss=0.285573, epoch loss=4.017039 Batch=959, step=4560, lr=0.181250, batch loss=0.239964, epoch loss=4.257003 Batch=1019, step=4620, lr=0.181000, batch loss=0.332659, epoch loss=4.589662 Batch=1079, step=4680, lr=0.180750, batch loss=0.196027, epoch loss=4.785689 Batch=1139, step=4740, lr=0.180500, batch loss=0.228950, epoch loss=5.014639 Batch=1199, step=4800, lr=0.180250, batch loss=0.193246, epoch loss=5.207885 Epoch=3, step=4800, lr=0.180250, epoch loss=5.207885 Batch=59, step=4860, lr=0.180000, batch loss=0.234383, epoch loss=0.234383 Batch=119, step=4920, lr=0.179750, batch loss=0.191891, epoch loss=0.426274 Batch=179, step=4980, lr=0.179500, batch loss=0.205456, epoch loss=0.631729 Batch=239, step=5040, lr=0.179250, batch loss=0.306251, epoch loss=0.937980 Batch=299, step=5100, lr=0.179000, batch loss=0.201331, epoch loss=1.139311 Batch=359, step=5160, lr=0.178750, batch loss=0.273721, epoch loss=1.413032 Batch=419, step=5220, lr=0.178500, batch loss=0.264076, epoch loss=1.677109 Batch=479, step=5280, lr=0.178250, batch loss=0.240434, epoch loss=1.917542 Batch=539, step=5340, lr=0.178000, batch loss=0.190972, epoch loss=2.108515 Batch=599, step=5400, lr=0.177750, batch loss=0.230533, epoch loss=2.339048 Batch=659, step=5460, lr=0.177500, batch loss=0.324543, epoch loss=2.663591 Batch=719, step=5520, lr=0.177250, batch loss=0.331829, epoch loss=2.995420 Batch=779, step=5580, lr=0.177000, batch loss=0.342035, epoch loss=3.337455 Batch=839, step=5640, lr=0.176750, batch loss=0.308977, epoch loss=3.646432 Batch=899, step=5700, lr=0.176500, batch loss=0.272327, epoch loss=3.918758 Batch=959, step=5760, lr=0.176250, batch loss=0.215641, epoch loss=4.134400 Batch=1019, step=5820, lr=0.176000, batch loss=0.337342, epoch loss=4.471742 Batch=1079, step=5880, lr=0.175750, batch loss=0.192055, epoch loss=4.663797 Batch=1139, step=5940, lr=0.175500, batch loss=0.221364, epoch loss=4.885160 Batch=1199, step=6000, lr=0.175250, batch loss=0.186290, epoch loss=5.071451 Epoch=4, step=6000, lr=0.175250, epoch loss=5.071451 Batch=59, step=6060, lr=0.175000, batch loss=0.223195, epoch loss=0.223195 Batch=119, step=6120, lr=0.174750, batch loss=0.186231, epoch loss=0.409426 Batch=179, step=6180, lr=0.174500, batch loss=0.199960, epoch loss=0.609385 Batch=239, step=6240, lr=0.174250, batch 
loss=0.299093, epoch loss=0.908478 Batch=299, step=6300, lr=0.174000, batch loss=0.215091, epoch loss=1.123569 Batch=359, step=6360, lr=0.173750, batch loss=0.279225, epoch loss=1.402795 Batch=419, step=6420, lr=0.173500, batch loss=0.275009, epoch loss=1.677803 Batch=479, step=6480, lr=0.173250, batch loss=0.251711, epoch loss=1.929514 Batch=539, step=6540, lr=0.173000, batch loss=0.188266, epoch loss=2.117781 Batch=599, step=6600, lr=0.172750, batch loss=0.229697, epoch loss=2.347478 Batch=659, step=6660, lr=0.172500, batch loss=0.312402, epoch loss=2.659881 Batch=719, step=6720, lr=0.172250, batch loss=0.316361, epoch loss=2.976242 Batch=779, step=6780, lr=0.172000, batch loss=0.327419, epoch loss=3.303660 Batch=839, step=6840, lr=0.171750, batch loss=0.298750, epoch loss=3.602410 Batch=899, step=6900, lr=0.171500, batch loss=0.260596, epoch loss=3.863007 Batch=959, step=6960, lr=0.171250, batch loss=0.190333, epoch loss=4.053340 Batch=1019, step=7020, lr=0.171000, batch loss=0.313040, epoch loss=4.366380 Batch=1079, step=7080, lr=0.170750, batch loss=0.186810, epoch loss=4.553190 Batch=1139, step=7140, lr=0.170500, batch loss=0.218472, epoch loss=4.771662 Batch=1199, step=7200, lr=0.170250, batch loss=0.180867, epoch loss=4.952529 Epoch=5, step=7200, lr=0.170250, epoch loss=4.952529 Batch=59, step=7260, lr=0.170000, batch loss=0.234753, epoch loss=0.234753 Batch=119, step=7320, lr=0.169750, batch loss=0.183073, epoch loss=0.417826 Batch=179, step=7380, lr=0.169500, batch loss=0.195721, epoch loss=0.613547 Batch=239, step=7440, lr=0.169250, batch loss=0.289866, epoch loss=0.903412 Batch=299, step=7500, lr=0.169000, batch loss=0.199178, epoch loss=1.102590 Batch=359, step=7560, lr=0.168750, batch loss=0.260198, epoch loss=1.362789 Batch=419, step=7620, lr=0.168500, batch loss=0.255089, epoch loss=1.617878 Batch=479, step=7680, lr=0.168250, batch loss=0.237264, epoch loss=1.855142 Batch=539, step=7740, lr=0.168000, batch loss=0.187166, epoch loss=2.042308 Batch=599, step=7800, lr=0.167750, batch loss=0.226259, epoch loss=2.268567 Batch=659, step=7860, lr=0.167500, batch loss=0.304779, epoch loss=2.573346 Batch=719, step=7920, lr=0.167250, batch loss=0.306773, epoch loss=2.880118 Batch=779, step=7980, lr=0.167000, batch loss=0.320322, epoch loss=3.200440 Batch=839, step=8040, lr=0.166750, batch loss=0.296006, epoch loss=3.496446 Batch=899, step=8100, lr=0.166500, batch loss=0.258324, epoch loss=3.754771 Batch=959, step=8160, lr=0.166250, batch loss=0.208261, epoch loss=3.963032 Batch=1019, step=8220, lr=0.166000, batch loss=0.320541, epoch loss=4.283573 Batch=1079, step=8280, lr=0.165750, batch loss=0.178404, epoch loss=4.461977 Batch=1139, step=8340, lr=0.165500, batch loss=0.211944, epoch loss=4.673920 Batch=1199, step=8400, lr=0.165250, batch loss=0.172715, epoch loss=4.846636 Epoch=6, step=8400, lr=0.165250, epoch loss=4.846636 Batch=59, step=8460, lr=0.165000, batch loss=0.199335, epoch loss=0.199335 Batch=119, step=8520, lr=0.164750, batch loss=0.173500, epoch loss=0.372835 Batch=179, step=8580, lr=0.164500, batch loss=0.187188, epoch loss=0.560023 Batch=239, step=8640, lr=0.164250, batch loss=0.276508, epoch loss=0.836531 Batch=299, step=8700, lr=0.164000, batch loss=0.192706, epoch loss=1.029237 Batch=359, step=8760, lr=0.163750, batch loss=0.248910, epoch loss=1.278147 Batch=419, step=8820, lr=0.163500, batch loss=0.242722, epoch loss=1.520869 Batch=479, step=8880, lr=0.163250, batch loss=0.229639, epoch loss=1.750508 Batch=539, step=8940, lr=0.163000, batch loss=0.175872, epoch 
loss=1.926381 Batch=599, step=9000, lr=0.162750, batch loss=0.219579, epoch loss=2.145960 Batch=659, step=9060, lr=0.162500, batch loss=0.292415, epoch loss=2.438375 Batch=719, step=9120, lr=0.162250, batch loss=0.295732, epoch loss=2.734107 Batch=779, step=9180, lr=0.162000, batch loss=0.312829, epoch loss=3.046936 Batch=839, step=9240, lr=0.161750, batch loss=0.281109, epoch loss=3.328045 Batch=899, step=9300, lr=0.161500, batch loss=0.252070, epoch loss=3.580115 Batch=959, step=9360, lr=0.161250, batch loss=0.187646, epoch loss=3.767761 Batch=1019, step=9420, lr=0.161000, batch loss=0.320139, epoch loss=4.087900 Batch=1079, step=9480, lr=0.160750, batch loss=0.198018, epoch loss=4.285918 Batch=1139, step=9540, lr=0.160500, batch loss=0.212058, epoch loss=4.497975 Batch=1199, step=9600, lr=0.160250, batch loss=0.167149, epoch loss=4.665124 Epoch=7, step=9600, lr=0.160250, epoch loss=4.665124 Batch=59, step=9660, lr=0.160000, batch loss=0.201059, epoch loss=0.201059 Batch=119, step=9720, lr=0.159750, batch loss=0.163620, epoch loss=0.364679 Batch=179, step=9780, lr=0.159500, batch loss=0.177847, epoch loss=0.542525 Batch=239, step=9840, lr=0.159250, batch loss=0.261717, epoch loss=0.804242 Batch=299, step=9900, lr=0.159000, batch loss=0.181803, epoch loss=0.986045 Batch=359, step=9960, lr=0.158750, batch loss=0.239727, epoch loss=1.225772 Batch=419, step=10020, lr=0.158500, batch loss=0.231406, epoch loss=1.457178 Batch=479, step=10080, lr=0.158250, batch loss=0.213247, epoch loss=1.670425 Batch=539, step=10140, lr=0.158000, batch loss=0.170210, epoch loss=1.840635 Batch=599, step=10200, lr=0.157750, batch loss=0.201913, epoch loss=2.042548 Batch=659, step=10260, lr=0.157500, batch loss=0.281054, epoch loss=2.323603 Batch=719, step=10320, lr=0.157250, batch loss=0.285309, epoch loss=2.608912 Batch=779, step=10380, lr=0.157000, batch loss=0.293461, epoch loss=2.902372 Batch=839, step=10440, lr=0.156750, batch loss=0.266538, epoch loss=3.168910 Batch=899, step=10500, lr=0.156500, batch loss=0.241159, epoch loss=3.410069 Batch=959, step=10560, lr=0.156250, batch loss=0.197703, epoch loss=3.607772 Batch=1019, step=10620, lr=0.156000, batch loss=0.279915, epoch loss=3.887687 Batch=1079, step=10680, lr=0.155750, batch loss=0.167133, epoch loss=4.054820 Batch=1139, step=10740, lr=0.155500, batch loss=0.193571, epoch loss=4.248391 Batch=1199, step=10800, lr=0.155250, batch loss=0.153300, epoch loss=4.401691 Epoch=8, step=10800, lr=0.155250, epoch loss=4.401691 Batch=59, step=10860, lr=0.155000, batch loss=0.183081, epoch loss=0.183081 Batch=119, step=10920, lr=0.154750, batch loss=0.153590, epoch loss=0.336670 Batch=179, step=10980, lr=0.154500, batch loss=0.166310, epoch loss=0.502981 Batch=239, step=11040, lr=0.154250, batch loss=0.242899, epoch loss=0.745880 Batch=299, step=11100, lr=0.154000, batch loss=0.165121, epoch loss=0.911001 Batch=359, step=11160, lr=0.153750, batch loss=0.225487, epoch loss=1.136488 Batch=419, step=11220, lr=0.153500, batch loss=0.226346, epoch loss=1.362833 Batch=479, step=11280, lr=0.153250, batch loss=0.202404, epoch loss=1.565237 Batch=539, step=11340, lr=0.153000, batch loss=0.158304, epoch loss=1.723542 Batch=599, step=11400, lr=0.152750, batch loss=0.179657, epoch loss=1.903199 Batch=659, step=11460, lr=0.152500, batch loss=0.262556, epoch loss=2.165755 Batch=719, step=11520, lr=0.152250, batch loss=0.253584, epoch loss=2.419339 Batch=779, step=11580, lr=0.152000, batch loss=0.268916, epoch loss=2.688255 Batch=839, step=11640, lr=0.151750, batch loss=0.256066, 
epoch loss=2.944321 Batch=899, step=11700, lr=0.151500, batch loss=0.214766, epoch loss=3.159087 Batch=959, step=11760, lr=0.151250, batch loss=0.162243, epoch loss=3.321329 Batch=1019, step=11820, lr=0.151000, batch loss=0.257410, epoch loss=3.578739 Batch=1079, step=11880, lr=0.150750, batch loss=0.141494, epoch loss=3.720233 Batch=1139, step=11940, lr=0.150500, batch loss=0.178931, epoch loss=3.899164 Batch=1199, step=12000, lr=0.150250, batch loss=0.138714, epoch loss=4.037877 Epoch=9, step=12000, lr=0.150250, epoch loss=4.037877 Batch=59, step=12060, lr=0.150000, batch loss=0.160776, epoch loss=0.160776 Batch=119, step=12120, lr=0.149750, batch loss=0.135850, epoch loss=0.296625 Batch=179, step=12180, lr=0.149500, batch loss=0.149706, epoch loss=0.446331 Batch=239, step=12240, lr=0.149250, batch loss=0.217230, epoch loss=0.663561 Batch=299, step=12300, lr=0.149000, batch loss=0.141400, epoch loss=0.804961 Batch=359, step=12360, lr=0.148750, batch loss=0.195702, epoch loss=1.000663 Batch=419, step=12420, lr=0.148500, batch loss=0.205125, epoch loss=1.205788 Batch=479, step=12480, lr=0.148250, batch loss=0.177439, epoch loss=1.383227 Batch=539, step=12540, lr=0.148000, batch loss=0.141576, epoch loss=1.524803 Batch=599, step=12600, lr=0.147750, batch loss=0.147562, epoch loss=1.672365 Batch=659, step=12660, lr=0.147500, batch loss=0.225841, epoch loss=1.898206 Batch=719, step=12720, lr=0.147250, batch loss=0.235109, epoch loss=2.133315 Batch=779, step=12780, lr=0.147000, batch loss=0.263977, epoch loss=2.397292 Batch=839, step=12840, lr=0.146750, batch loss=0.232582, epoch loss=2.629874 Batch=899, step=12900, lr=0.146500, batch loss=0.211141, epoch loss=2.841015 Batch=959, step=12960, lr=0.146250, batch loss=0.150916, epoch loss=2.991932 Batch=1019, step=13020, lr=0.146000, batch loss=0.264289, epoch loss=3.256221 Batch=1079, step=13080, lr=0.145750, batch loss=0.111444, epoch loss=3.367665 Batch=1139, step=13140, lr=0.145500, batch loss=0.151671, epoch loss=3.519336 Batch=1199, step=13200, lr=0.145250, batch loss=0.115668, epoch loss=3.635004 Epoch=10, step=13200, lr=0.145250, epoch loss=3.635004 Batch=59, step=13260, lr=0.145000, batch loss=0.143059, epoch loss=0.143059 Batch=119, step=13320, lr=0.144750, batch loss=0.125466, epoch loss=0.268525 Batch=179, step=13380, lr=0.144500, batch loss=0.128428, epoch loss=0.396953 Batch=239, step=13440, lr=0.144250, batch loss=0.187115, epoch loss=0.584068 Batch=299, step=13500, lr=0.144000, batch loss=0.120796, epoch loss=0.704865 Batch=359, step=13560, lr=0.143750, batch loss=0.161001, epoch loss=0.865865 Batch=419, step=13620, lr=0.143500, batch loss=0.160141, epoch loss=1.026006 Batch=479, step=13680, lr=0.143250, batch loss=0.145020, epoch loss=1.171026 Batch=539, step=13740, lr=0.143000, batch loss=0.117532, epoch loss=1.288558 Batch=599, step=13800, lr=0.142750, batch loss=0.118328, epoch loss=1.406886 Batch=659, step=13860, lr=0.142500, batch loss=0.173160, epoch loss=1.580046 Batch=719, step=13920, lr=0.142250, batch loss=0.175190, epoch loss=1.755236 Batch=779, step=13980, lr=0.142000, batch loss=0.194289, epoch loss=1.949525 Batch=839, step=14040, lr=0.141750, batch loss=0.182973, epoch loss=2.132498 Batch=899, step=14100, lr=0.141500, batch loss=0.155621, epoch loss=2.288119 Batch=959, step=14160, lr=0.141250, batch loss=0.131922, epoch loss=2.420041 Batch=1019, step=14220, lr=0.141000, batch loss=0.273954, epoch loss=2.693995 Batch=1079, step=14280, lr=0.140750, batch loss=0.080318, epoch loss=2.774313 Batch=1139, step=14340, 
lr=0.140500, batch loss=0.124583, epoch loss=2.898896 Batch=1199, step=14400, lr=0.140250, batch loss=0.087555, epoch loss=2.986451 Epoch=11, step=14400, lr=0.140250, epoch loss=2.986451 Batch=59, step=14460, lr=0.140000, batch loss=0.108956, epoch loss=0.108956 Batch=119, step=14520, lr=0.139750, batch loss=0.107415, epoch loss=0.216370 Batch=179, step=14580, lr=0.139500, batch loss=0.102845, epoch loss=0.319215 Batch=239, step=14640, lr=0.139250, batch loss=0.137486, epoch loss=0.456701 Batch=299, step=14700, lr=0.139000, batch loss=0.076081, epoch loss=0.532782 Batch=359, step=14760, lr=0.138750, batch loss=0.115961, epoch loss=0.648743 Batch=419, step=14820, lr=0.138500, batch loss=0.125297, epoch loss=0.774040 Batch=479, step=14880, lr=0.138250, batch loss=0.100667, epoch loss=0.874707 Batch=539, step=14940, lr=0.138000, batch loss=0.114941, epoch loss=0.989648 Batch=599, step=15000, lr=0.137750, batch loss=0.086978, epoch loss=1.076626 Batch=659, step=15060, lr=0.137500, batch loss=0.123511, epoch loss=1.200137 Batch=719, step=15120, lr=0.137250, batch loss=0.127418, epoch loss=1.327555 Batch=779, step=15180, lr=0.137000, batch loss=0.148578, epoch loss=1.476133 Batch=839, step=15240, lr=0.136750, batch loss=0.147807, epoch loss=1.623940 Batch=899, step=15300, lr=0.136500, batch loss=0.177022, epoch loss=1.800961 Batch=959, step=15360, lr=0.136250, batch loss=0.088287, epoch loss=1.889248 Batch=1019, step=15420, lr=0.136000, batch loss=0.167109, epoch loss=2.056357 Batch=1079, step=15480, lr=0.135750, batch loss=0.039899, epoch loss=2.096256 Batch=1139, step=15540, lr=0.135500, batch loss=0.096714, epoch loss=2.192971 Batch=1199, step=15600, lr=0.135250, batch loss=0.056452, epoch loss=2.249422 Epoch=12, step=15600, lr=0.135250, epoch loss=2.249422 Batch=59, step=15660, lr=0.135000, batch loss=0.073272, epoch loss=0.073272 Batch=119, step=15720, lr=0.134750, batch loss=0.128584, epoch loss=0.201855 Batch=179, step=15780, lr=0.134500, batch loss=0.092506, epoch loss=0.294362 Batch=239, step=15840, lr=0.134250, batch loss=0.091002, epoch loss=0.385363 Batch=299, step=15900, lr=0.134000, batch loss=0.035866, epoch loss=0.421229 Batch=359, step=15960, lr=0.133750, batch loss=0.080610, epoch loss=0.501840 Batch=419, step=16020, lr=0.133500, batch loss=0.075518, epoch loss=0.577358 Batch=479, step=16080, lr=0.133250, batch loss=0.060216, epoch loss=0.637574 Batch=539, step=16140, lr=0.133000, batch loss=0.065016, epoch loss=0.702590 Batch=599, step=16200, lr=0.132750, batch loss=0.158302, epoch loss=0.860891 Batch=659, step=16260, lr=0.132500, batch loss=0.089386, epoch loss=0.950277 Batch=719, step=16320, lr=0.132250, batch loss=0.118343, epoch loss=1.068621 Batch=779, step=16380, lr=0.132000, batch loss=0.295095, epoch loss=1.363715 Batch=839, step=16440, lr=0.131750, batch loss=0.090499, epoch loss=1.454214 Batch=899, step=16500, lr=0.131500, batch loss=0.078894, epoch loss=1.533109 Batch=959, step=16560, lr=0.131250, batch loss=0.030617, epoch loss=1.563726 Batch=1019, step=16620, lr=0.131000, batch loss=0.051736, epoch loss=1.615461 Batch=1079, step=16680, lr=0.130750, batch loss=0.048837, epoch loss=1.664299 Batch=1139, step=16740, lr=0.130500, batch loss=0.096597, epoch loss=1.760896 Batch=1199, step=16800, lr=0.130250, batch loss=0.046877, epoch loss=1.807772 Epoch=13, step=16800, lr=0.130250, epoch loss=1.807772 Batch=59, step=16860, lr=0.130000, batch loss=0.033440, epoch loss=0.033440 Batch=119, step=16920, lr=0.129750, batch loss=0.033280, epoch loss=0.066720 Batch=179, 
step=16980, lr=0.129500, batch loss=0.041670, epoch loss=0.108390
Batch=239, step=17040, lr=0.129250, batch loss=0.056656, epoch loss=0.165046
Batch=299, step=17100, lr=0.129000, batch loss=0.019246, epoch loss=0.184292
Batch=359, step=17160, lr=0.128750, batch loss=0.041844, epoch loss=0.226136
Batch=419, step=17220, lr=0.128500, batch loss=0.063085, epoch loss=0.289221
Batch=479, step=17280, lr=0.128250, batch loss=0.023038, epoch loss=0.312258
Batch=539, step=17340, lr=0.128000, batch loss=0.027046, epoch loss=0.339304
Batch=599, step=17400, lr=0.127750, batch loss=0.033575, epoch loss=0.372879
Batch=659, step=17460, lr=0.127500, batch loss=0.045717, epoch loss=0.418596
Batch=719, step=17520, lr=0.127250, batch loss=0.043224, epoch loss=0.461820
Batch=779, step=17580, lr=0.127000, batch loss=0.081219, epoch loss=0.543039
Batch=839, step=17640, lr=0.126750, batch loss=0.206573, epoch loss=0.749613
Batch=899, step=17700, lr=0.126500, batch loss=0.051812, epoch loss=0.801425
Batch=959, step=17760, lr=0.126250, batch loss=0.020573, epoch loss=0.821998
Batch=1019, step=17820, lr=0.126000, batch loss=0.038327, epoch loss=0.860324
Batch=1079, step=17880, lr=0.125750, batch loss=0.016653, epoch loss=0.876977
Batch=1139, step=17940, lr=0.125500, batch loss=0.044474, epoch loss=0.921451
Batch=1199, step=18000, lr=0.125250, batch loss=0.018301, epoch loss=0.939752
Epoch=14, step=18000, lr=0.125250, epoch loss=0.939752
Batch=59, step=18060, lr=0.125000, batch loss=0.012044, epoch loss=0.012044
Batch=119, step=18120, lr=0.124750, batch loss=0.017354, epoch loss=0.029399
Batch=179, step=18180, lr=0.124500, batch loss=0.028169, epoch loss=0.057568
Batch=239, step=18240, lr=0.124250, batch loss=0.031502, epoch loss=0.089069
Batch=299, step=18300, lr=0.124000, batch loss=0.009327, epoch loss=0.098396
Batch=359, step=18360, lr=0.123750, batch loss=0.021542, epoch loss=0.119938
Batch=419, step=18420, lr=0.123500, batch loss=0.034255, epoch loss=0.154193
Batch=479, step=18480, lr=0.123250, batch loss=0.024419, epoch loss=0.178612
Batch=539, step=18540, lr=0.123000, batch loss=0.052277, epoch loss=0.230889
Batch=599, step=18600, lr=0.122750, batch loss=0.026197, epoch loss=0.257086
Batch=659, step=18660, lr=0.122500, batch loss=0.030169, epoch loss=0.287255
Batch=719, step=18720, lr=0.122250, batch loss=0.031047, epoch loss=0.318302
Batch=779, step=18780, lr=0.122000, batch loss=0.077149, epoch loss=0.395451
Batch=839, step=18840, lr=0.121750, batch loss=0.044356, epoch loss=0.439807
Batch=899, step=18900, lr=0.121500, batch loss=0.044875, epoch loss=0.484683
Batch=959, step=18960, lr=0.121250, batch loss=0.014425, epoch loss=0.499108
Batch=1019, step=19020, lr=0.121000, batch loss=0.019370, epoch loss=0.518477
Batch=1079, step=19080, lr=0.120750, batch loss=0.014772, epoch loss=0.533250
Batch=1139, step=19140, lr=0.120500, batch loss=0.029890, epoch loss=0.563139
Batch=1199, step=19200, lr=0.120250, batch loss=0.010997, epoch loss=0.574136
Epoch=15, step=19200, lr=0.120250, epoch loss=0.574136
Batch=59, step=19260, lr=0.120000, batch loss=0.008802, epoch loss=0.008802
Batch=119, step=19320, lr=0.119750, batch loss=0.032939, epoch loss=0.041741
Batch=179, step=19380, lr=0.119500, batch loss=0.083478, epoch loss=0.125219
Batch=239, step=19440, lr=0.119250, batch loss=0.032766, epoch loss=0.157985
Batch=299, step=19500, lr=0.119000, batch loss=0.009794, epoch loss=0.167779
Batch=359, step=19560, lr=0.118750, batch loss=0.023195, epoch loss=0.190974
Batch=419, step=19620, lr=0.118500, batch loss=0.020014, epoch loss=0.210988
Batch=479, step=19680, lr=0.118250, batch loss=0.007962, epoch loss=0.218950
Batch=539, step=19740, lr=0.118000, batch loss=0.017936, epoch loss=0.236886
Batch=599, step=19800, lr=0.117750, batch loss=0.022438, epoch loss=0.259324
Batch=659, step=19860, lr=0.117500, batch loss=0.016398, epoch loss=0.275722
Batch=719, step=19920, lr=0.117250, batch loss=0.036580, epoch loss=0.312302
Batch=779, step=19980, lr=0.117000, batch loss=0.079411, epoch loss=0.391712
Batch=839, step=20040, lr=0.116750, batch loss=0.030269, epoch loss=0.421982
Batch=899, step=20100, lr=0.116500, batch loss=0.030012, epoch loss=0.451994
Batch=959, step=20160, lr=0.116250, batch loss=0.017711, epoch loss=0.469705
Batch=1019, step=20220, lr=0.116000, batch loss=0.023266, epoch loss=0.492971
Batch=1079, step=20280, lr=0.115750, batch loss=0.001765, epoch loss=0.494736
Batch=1139, step=20340, lr=0.115500, batch loss=0.015292, epoch loss=0.510028
Batch=1199, step=20400, lr=0.115250, batch loss=0.006157, epoch loss=0.516185
Epoch=16, step=20400, lr=0.115250, epoch loss=0.516185
Batch=59, step=20460, lr=0.115000, batch loss=0.002842, epoch loss=0.002842
Batch=119, step=20520, lr=0.114750, batch loss=0.009795, epoch loss=0.012637
Batch=179, step=20580, lr=0.114500, batch loss=0.020499, epoch loss=0.033136
Batch=239, step=20640, lr=0.114250, batch loss=0.018351, epoch loss=0.051487
Batch=299, step=20700, lr=0.114000, batch loss=0.007135, epoch loss=0.058623
Batch=359, step=20760, lr=0.113750, batch loss=0.013137, epoch loss=0.071759
Batch=419, step=20820, lr=0.113500, batch loss=0.014700, epoch loss=0.086459
Batch=479, step=20880, lr=0.113250, batch loss=0.003980, epoch loss=0.090439
Batch=539, step=20940, lr=0.113000, batch loss=0.015462, epoch loss=0.105901
Batch=599, step=21000, lr=0.112750, batch loss=0.018216, epoch loss=0.124117
Batch=659, step=21060, lr=0.112500, batch loss=0.014782, epoch loss=0.138899
Batch=719, step=21120, lr=0.112250, batch loss=0.040756, epoch loss=0.179656
Batch=779, step=21180, lr=0.112000, batch loss=0.071966, epoch loss=0.251621
Batch=839, step=21240, lr=0.111750, batch loss=0.025803, epoch loss=0.277424
Batch=899, step=21300, lr=0.111500, batch loss=0.037029, epoch loss=0.314453
Batch=959, step=21360, lr=0.111250, batch loss=0.009695, epoch loss=0.324148
Batch=1019, step=21420, lr=0.111000, batch loss=0.011175, epoch loss=0.335323
Batch=1079, step=21480, lr=0.110750, batch loss=0.000597, epoch loss=0.335920
Batch=1139, step=21540, lr=0.110500, batch loss=0.012843, epoch loss=0.348762
Batch=1199, step=21600, lr=0.110250, batch loss=0.005294, epoch loss=0.354056
Epoch=17, step=21600, lr=0.110250, epoch loss=0.354056
Batch=59, step=21660, lr=0.110000, batch loss=0.001803, epoch loss=0.001803
Batch=119, step=21720, lr=0.109750, batch loss=0.006112, epoch loss=0.007915
Batch=179, step=21780, lr=0.109500, batch loss=0.012494, epoch loss=0.020409
Batch=239, step=21840, lr=0.109250, batch loss=0.011187, epoch loss=0.031596
Batch=299, step=21900, lr=0.109000, batch loss=0.012591, epoch loss=0.044187
Batch=359, step=21960, lr=0.108750, batch loss=0.012611, epoch loss=0.056797
Batch=419, step=22020, lr=0.108500, batch loss=0.012020, epoch loss=0.068818
Batch=479, step=22080, lr=0.108250, batch loss=0.002386, epoch loss=0.071204
Batch=539, step=22140, lr=0.108000, batch loss=0.016474, epoch loss=0.087678
Batch=599, step=22200, lr=0.107750, batch loss=0.016720, epoch loss=0.104398
Batch=659, step=22260, lr=0.107500, batch loss=0.016331, epoch loss=0.120730
Batch=719, step=22320, lr=0.107250, batch loss=0.021497, epoch loss=0.142227
Batch=779, step=22380, lr=0.107000, batch loss=0.028173, epoch loss=0.170399
Batch=839, step=22440, lr=0.106750, batch loss=0.031005, epoch loss=0.201404
Batch=899, step=22500, lr=0.106500, batch loss=0.023342, epoch loss=0.224746
Batch=959, step=22560, lr=0.106250, batch loss=0.010214, epoch loss=0.234960
Batch=1019, step=22620, lr=0.106000, batch loss=0.008396, epoch loss=0.243356
Batch=1079, step=22680, lr=0.105750, batch loss=0.001312, epoch loss=0.244667
Batch=1139, step=22740, lr=0.105500, batch loss=0.010079, epoch loss=0.254746
Batch=1199, step=22800, lr=0.105250, batch loss=0.005517, epoch loss=0.260262
Epoch=18, step=22800, lr=0.105250, epoch loss=0.260262
Batch=59, step=22860, lr=0.105000, batch loss=0.002392, epoch loss=0.002392
Batch=119, step=22920, lr=0.104750, batch loss=0.004621, epoch loss=0.007013
Batch=179, step=22980, lr=0.104500, batch loss=0.012409, epoch loss=0.019422
Batch=239, step=23040, lr=0.104250, batch loss=0.008299, epoch loss=0.027722
Batch=299, step=23100, lr=0.104000, batch loss=0.008509, epoch loss=0.036230
Batch=359, step=23160, lr=0.103750, batch loss=0.012252, epoch loss=0.048482
Batch=419, step=23220, lr=0.103500, batch loss=0.011082, epoch loss=0.059565
Batch=479, step=23280, lr=0.103250, batch loss=0.003135, epoch loss=0.062699
Batch=539, step=23340, lr=0.103000, batch loss=0.016263, epoch loss=0.078963
Batch=599, step=23400, lr=0.102750, batch loss=0.013623, epoch loss=0.092586
Batch=659, step=23460, lr=0.102500, batch loss=0.011248, epoch loss=0.103834
Batch=719, step=23520, lr=0.102250, batch loss=0.014383, epoch loss=0.118218
Batch=779, step=23580, lr=0.102000, batch loss=0.022126, epoch loss=0.140343
Batch=839, step=23640, lr=0.101750, batch loss=0.024345, epoch loss=0.164688
Batch=899, step=23700, lr=0.101500, batch loss=0.023512, epoch loss=0.188200
Batch=959, step=23760, lr=0.101250, batch loss=0.007760, epoch loss=0.195960
Batch=1019, step=23820, lr=0.101000, batch loss=0.007713, epoch loss=0.203673
Batch=1079, step=23880, lr=0.100750, batch loss=0.000824, epoch loss=0.204497
Batch=1139, step=23940, lr=0.100500, batch loss=0.008452, epoch loss=0.212949
Batch=1199, step=24000, lr=0.100250, batch loss=0.004382, epoch loss=0.217331
Epoch=19, step=24000, lr=0.100250, epoch loss=0.217331
Half-moons scatterplot and decision boundary:
┌────────────────────────────────────────────────────────────────────────────────────────────────────┐
│********************************#*******************************************************************│
│**********************#*#*#######*###*#####*********************************************************│
│**********************#########################*****************************************************│
│*****************#**########*######*###########*###*************************************************│
│***************#################*###################************************************************│
│************######*#################*#################**********************************************│
│**********#*#####*########*#**************##*#########*#********************************************│
│***********########*##*#******************#*****##########******************************************│
│***********###########*************************############***************************************..│
│********######*####*********************************###*###*#**********************************.....│
│*******######**##*************...******************#*######*#********************************.......│
│*******##*##**##**********...........***************########*##***************************..........│
│*****#######************.......%...%%...***************#########*************************.........%.│
│******######**********..........%........***************##*#####************************......%.%.%.│
│***#########**********.........%%%.%%......*************#*#######*********************.......%.%%%%.│
│****#######**********..........%%%%.........************#########********************........%%.%%.%│
│**#######************..........%%%%%%%.......**************###*###******************.........%%%%%%.│
│*##*####************...........%%%%%%%.........***********########*****************..........%%%%%%.│
│*#######************...........%%%%%%%..........************#######**************............%%%%%%.│
│*##*####***********............%%.%%%%%...........***********####***************............%%%%%%%.│
│*#####*#**********..............%%%%%%%............**********##*###************..............%%%%%..│
│#######***********.............%.%%%%%%.............*********#######*********..............%%%%.%%..│
│#####*#**********...............%%%%%%%...............*******#######********...............%%%%%%%%.│
│###*#*#**********...............%%%%%%%%%..............*******######*******................%%%%%%...│
│#######*********.................%%%%%%%%...............*****###*###******................%%%%%%....│
│######**********.................%%%%%%%%%................***#*###******................%%%%%%%%%...│
│*#*##*#********...................%%%%%%%%%%...............***######***..................%%%%%%.....│
│#****##********....................%%%%%%%%%.................**###*#**................%.%%%%%%%.....│
│**************.....................%.%%%%%%...................*******..................%.%%.%%......│
│*************........................%..%%%%%%%................****...............%.%%%%%%%%%.......│
│*************.........................%.%%%.%%%%.................*................%%%%%%%.%.%.......│
│************............................%..%%%%..%................................%%%%%%%%..........│
│************.............................%%%%%%%%%%%........................%%..%%%%%%%%.%..........│
│***********..............................%%.%%%%%%%%..%....................%..%%%.%%%%%%%...........│
│***********.................................%%%%.%%%%%%%%...............%.%%%%%%%%%%%%.%............│
│**********...................................%%%%%%%%%%%%%%%%%%%%%%.%%%%.%%%%%%%%%%%%%..............│
│*********.....................................%%.%%%%%%%%%%%%%%%%%%%%%%.%%%%%%%%%%%.................│
│*********.........................................%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%...................│
│********.............................................%%%.%%%%%%%%%%%%%%%%%%%%%......................│
│********................................................%...%%%%.%%.%%%%..%.........................│
└────────────────────────────────────────────────────────────────────────────────────────────────────┘
2025-03-20 22:15.45 ---> saved as "d35effa352846eb5ba819eb4037f03127244589da08545aa8ab46d49efca1ae1"
Job succeeded
2025-03-20 22:15.46: Job succeeded
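The run above is the classic half-moons toy classification task: two interleaved arcs of points, with the trained network's predicted regions printed around them. In the plot, '%' and '#' appear to mark the sampled points of the two classes, while '.' and '*' shade the regions the model assigns to each class. The logged learning rate is consistent with a linear decay of 0.00025 every 60 steps (for example lr=0.129250 at step 17040 and lr=0.100250 at step 24000). Below is a minimal, hypothetical OCaml sketch of such a dataset and of that schedule, using only the standard library; the names half_moons and lr_at, the noise model, and the extrapolated base rate 0.20025 are illustrative assumptions, not the ocannl test code.

(* Hypothetical illustration only; not code from the ocannl repository. *)
(* half_moons: the standard two-interleaved-arcs dataset assumed to underlie the plot above. *)
let half_moons ~n ~noise =
  List.init n (fun i ->
      let t = Float.pi *. float_of_int i /. float_of_int (max 1 (n - 1)) in
      let jitter () = (Random.float 2.0 -. 1.0) *. noise in
      [ (cos t +. jitter (), sin t +. jitter (), 1);               (* upper arc, class 1 *)
        (1.0 -. cos t +. jitter (), 0.5 -. sin t +. jitter (), 0)  (* lower, shifted arc, class 0 *) ])
  |> List.concat

(* lr_at: linear decay of 0.00025 every 60 steps; 0.20025 extrapolates the logged schedule back to step 0. *)
let lr_at ~step = 0.20025 -. (0.00025 *. float_of_int (step / 60))

(* Sanity check against a value printed in the log. *)
let () = assert (Float.abs (lr_at ~step:17040 -. 0.129250) < 1e-9)

The decreasing epoch loss (0.94 at epoch 14 down to 0.22 at epoch 19) together with the plotted decision regions hugging the two arcs is what the test inspects; the exact loss values depend on the random seed and the real ocannl training code, which is not reproduced here.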