2025-03-21 15:22.29: New job: test ahrefs/ocannl https://github.com/ahrefs/ocannl.git#refs/heads/master (f6ea3750d181d26e3fd23df51a415f362cff8525) (linux-x86_64:debian-12-5.3+flambda_opam-2.3)
Base: ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96

Opam project build

To reproduce locally:

git clone --recursive "https://github.com/ahrefs/ocannl.git" -b "master" && cd "ocannl" && git reset --hard f6ea3750
cat > Dockerfile <<'END-OF-DOCKERFILE'
FROM ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96
# debian-12-5.3+flambda_opam-2.3
USER 1000:1000
ENV CLICOLOR_FORCE="1"
ENV OPAMCOLOR="always"
WORKDIR /src
RUN sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam
RUN opam init --reinit -ni
RUN uname -rs && opam exec -- ocaml -version && opam --version
WORKDIR /src
RUN sudo chown opam /src
RUN cd ~/opam-repository && (git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master) && git reset -q --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 && git log --no-decorate -n1 --oneline && opam update -u
COPY --chown=1000:1000 neural_nets_lib.opam arrayjit.opam ./
RUN opam pin add -yn neural_nets_lib.dev './' && \
    opam pin add -yn arrayjit.dev './'
RUN echo '(lang dune 3.0)' > './dune-project'
ENV DEPS="angstrom.0.16.1 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.17.2 dune-configurator.3.17.2 fieldslib.v0.17.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-syntax-shims.1.0.0 ocaml-variants.5.3.0+options ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.1.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0"
ENV CI="true"
ENV OCAMLCI="true"
RUN opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS
RUN opam install $DEPS
COPY --chown=1000:1000 . /src
RUN opam exec -- dune build @install @check @runtest && rm -rf _build
END-OF-DOCKERFILE

docker build .
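Optionally (not part of the CI job itself), the test step can be rerun from the image built above. A minimal sketch, assuming the image is tagged ocannl-repro (an arbitrary name) and relying on the WORKDIR /src and the opam switch baked into the image:

docker build -t ocannl-repro .
docker run --rm ocannl-repro opam exec -- dune build @runtest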
END-REPRO-BLOCK

2025-03-21 15:22.29: Using cache hint "ahrefs/ocannl-ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96-debian-12-5.3+flambda_opam-2.3-14a85f4c565cc30186c137b219fc7fa2"

2025-03-21 15:22.29: Using OBuilder spec:
((from ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96)
 (comment debian-12-5.3+flambda_opam-2.3)
 (user (uid 1000) (gid 1000))
 (env CLICOLOR_FORCE 1)
 (env OPAMCOLOR always)
 (workdir /src)
 (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam"))
 (run (shell "opam init --reinit -ni"))
 (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
 (workdir /src)
 (run (shell "sudo chown opam /src"))
 (run (cache (opam-archives (target /home/opam/.opam/download-cache)))
      (network host)
      (shell "cd ~/opam-repository && (git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master) && git reset -q --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 && git log --no-decorate -n1 --oneline && opam update -u"))
 (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./))
 (run (network host)
      (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'"))
 (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'"))
 (env DEPS "angstrom.0.16.1 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.17.2 dune-configurator.3.17.2 fieldslib.v0.17.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-syntax-shims.1.0.0 ocaml-variants.5.3.0+options ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.1.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0")
 (env CI true)
 (env OCAMLCI true)
 (run (cache (opam-archives (target /home/opam/.opam/download-cache)))
      (network host)
      (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS"))
 (run (cache (opam-archives (target /home/opam/.opam/download-cache)))
      (network host)
      (shell "opam install $DEPS"))
 (copy (src .) (dst /src))
 (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build"))
)

2025-03-21 15:22.29: Waiting for resource in pool OCluster
2025-03-21 15:22.29: Waiting for worker…
2025-03-21 15:22.29: Got resource from pool OCluster
Building on asteria.caelum.ci.dev
HEAD is now at 8b6a6fac Fix bug in grad formula for recip, update tests
HEAD is now at f6ea3750 Untested: revert the Cmpne primitive op: can be used to test for NaN (x <> x ==> x = NaN)

(from ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96)
2025-03-21 15:22.32 ---> using "0cceac30ed9dfa8d54c8dfb703526aecc2d1f25e09755ea19f6a9b3ce08944d1" from cache

/: (comment debian-12-5.3+flambda_opam-2.3)
/: (user (uid 1000) (gid 1000))
/: (env CLICOLOR_FORCE 1)
/: (env OPAMCOLOR always)
/: (workdir /src)
/src: (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam"))
2025-03-21 15:22.32 ---> using "b6cc72d0b69338afee388438c163da01e1509537d9db800516f8d6f84e0ff0f0" from cache

/src: (run (shell "opam init --reinit -ni"))
Configuring from /home/opam/.opamrc and then from built-in defaults.
Checking for available remotes: rsync and local, git.
  - you won't be able to use mercurial repositories unless you install the hg command on your system.
  - you won't be able to use darcs repositories unless you install the darcs command on your system.

This development version of opam requires an update to the layout of /home/opam/.opam from version 2.0 to version 2.2, which can't be reverted.
You may want to back it up before going further. Continue? [y/n] y

[NOTE] The 'jobs' option was reset, its value was 39 and its new value will vary according to the current number of cores on your machine. You can restore the fixed value using:
            opam option jobs=39 --global
Format upgrade done.
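For context on the two commits checked out above: the recip gradient that 8b6a6fac fixes is presumably the standard derivative d/dx (1/x) = -1/x^2, i.e. the backward pass scales the incoming gradient by -1/x^2; and the Cmpne-based NaN test mentioned for f6ea3750 works because NaN is the only floating-point value for which x <> x holds.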
<><> Updating repositories ><><><><><><><><><><><><><><><><><><><><><><><><><><>
[ERROR] Could not update repository "opam-repository-archive": "/usr/bin/git fetch -q" exited with code 128 "fatal: unable to access 'https://github.com/ocaml/opam-repository-archive/': Could not resolve host: github.com"
[default] synchronised from file:///home/opam/opam-repository
2025-03-21 15:22.32 ---> using "9579671be2547253d961834dbf99a2617c3043d50341e16980a14c2b7946d157" from cache

/src: (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
Linux 5.15.0-134-generic
The OCaml toplevel, version 5.3.0
2.3.0
2025-03-21 15:22.32 ---> using "21f2427316ecd3b2b1d06245ccc85b94bbe86b75b44d925374a8e2c678f4916d" from cache

/src: (workdir /src)
/src: (run (shell "sudo chown opam /src"))
2025-03-21 15:22.32 ---> using "a9c2b183ae9ffb50c46a6bddc4bdbd4bc2a49a6d4fa95d31a0495ab1d51f7cf3" from cache

/src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master) && git reset -q --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 && git log --no-decorate -n1 --oneline && opam update -u"))
From https://github.com/ocaml/opam-repository
 * branch                  master     -> FETCH_HEAD
   862a7640b1..acfb0e6e94  master     -> origin/master
4e25d0cf5f Merge pull request #27651 from lukstafi/opam-publish-ppx_minidebug.2.1.0

<><> Updating package repositories ><><><><><><><><><><><><><><><><><><><><><><>
[opam-repository-archive] synchronised from git+https://github.com/ocaml/opam-repository-archive
[default] synchronised from file:///home/opam/opam-repository
Everything as up-to-date as possible (run with --verbose to show unavailable upgrades).
However, you may "opam upgrade" these packages explicitly, which will ask permission to downgrade or uninstall the conflicting packages.
Nothing to do.
# To update the current shell environment, run: eval $(opam env) 2025-03-21 15:22.32 ---> using "af53f33f5b819debc22b733209c9b4785d8a42ec3a79f8a616d660e6188b4b5a" from cache /src: (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./)) 2025-03-21 15:22.32 ---> using "b323d96ffeacb25bfe6adb5928a31ac9ec899d21475a20216b0092dd5e664a7b" from cache /src: (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'")) [neural_nets_lib.dev] synchronised (file:///src) neural_nets_lib is now pinned to file:///src (version dev) [arrayjit.dev] synchronised (file:///src) arrayjit is now pinned to file:///src (version dev) 2025-03-21 15:22.32 ---> using "5f054a69009203df3f20573d4db5ccf09ae607dbf7085fcf65509d3b4f430b0c" from cache /src: (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'")) 2025-03-21 15:22.32 ---> using "bb5d4cd51cb16649a0b1a07c13f42656728e057e546d3328dcdcc17c1e452017" from cache /src: (env DEPS "angstrom.0.16.1 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.17.2 dune-configurator.3.17.2 fieldslib.v0.17.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-syntax-shims.1.0.0 ocaml-variants.5.3.0+options ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.1.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0") /src: (env CI true) /src: (env OCAMLCI true) /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS")) + /usr/bin/sudo "apt-get" "update" - Get:1 http://deb.debian.org/debian bookworm InRelease [151 kB] - Get:2 http://deb.debian.org/debian bookworm-updates InRelease [55.4 kB] - Get:3 http://deb.debian.org/debian-security bookworm-security InRelease [48.0 kB] - Get:4 http://deb.debian.org/debian bookworm/main amd64 Packages [8792 kB] - Get:5 http://deb.debian.org/debian-security bookworm-security/main amd64 Packages [249 kB] - Fetched 9296 kB in 3s (3298 kB/s) - Reading package lists... - <><> Synchronising pinned packages ><><><><><><><><><><><><><><><><><><><><><><> [arrayjit.dev] synchronised (file:///src) [neural_nets_lib.dev] synchronised (file:///src) [NOTE] Package ocaml-variants is already installed (current version is 5.3.0+options). [NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). 
[NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). [NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). [NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). The following system packages will first need to be installed: libffi-dev pkg-config <><> Handling external dependencies <><><><><><><><><><><><><><><><><><><><><><> + /usr/bin/sudo "apt-get" "install" "-qq" "-yy" "libffi-dev" "pkg-config" - debconf: delaying package configuration, since apt-utils is not installed - Selecting previously unselected package libffi-dev:amd64. - (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 18776 files and directories currently installed.) - Preparing to unpack .../libffi-dev_3.4.4-1_amd64.deb ... - Unpacking libffi-dev:amd64 (3.4.4-1) ... - Selecting previously unselected package libpkgconf3:amd64. - Preparing to unpack .../libpkgconf3_1.8.1-1_amd64.deb ... - Unpacking libpkgconf3:amd64 (1.8.1-1) ... - Selecting previously unselected package pkgconf-bin. - Preparing to unpack .../pkgconf-bin_1.8.1-1_amd64.deb ... - Unpacking pkgconf-bin (1.8.1-1) ... - Selecting previously unselected package pkgconf:amd64. - Preparing to unpack .../pkgconf_1.8.1-1_amd64.deb ... - Unpacking pkgconf:amd64 (1.8.1-1) ... - Selecting previously unselected package pkg-config:amd64. - Preparing to unpack .../pkg-config_1.8.1-1_amd64.deb ... - Unpacking pkg-config:amd64 (1.8.1-1) ... - Setting up libffi-dev:amd64 (3.4.4-1) ... - Setting up libpkgconf3:amd64 (1.8.1-1) ... - Setting up pkgconf-bin (1.8.1-1) ... - Setting up pkgconf:amd64 (1.8.1-1) ... - Setting up pkg-config:amd64 (1.8.1-1) ... - Processing triggers for libc-bin (2.36-9+deb12u9) ... 2025-03-21 15:22.32 ---> using "8238c45edf3cd1d100506951c4dcace83467710698b325f4e5379b732f1032f5" from cache /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS")) [NOTE] Package ocaml-variants is already installed (current version is 5.3.0+options). [NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). [NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). [NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). [NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). 
The following actions will be performed: === install 65 packages - install angstrom 0.16.1 - install backoff 0.1.1 - install base v0.17.1 - install bigarray-compat 1.1.0 - install bigstringaf 0.10.0 - install conf-libffi 2.0.0 - install conf-pkg-config 4 - install cppo 1.8.0 - install csexp 1.5.2 - install ctypes 0.23.0 - install ctypes-foreign 0.23.0 - install dune 3.17.2 - install dune-configurator 3.17.2 - install fieldslib v0.17.0 - install integers 0.7.0 - install jane-street-headers v0.17.0 - install jst-config v0.17.0 - install mtime 2.1.0 - install multicore-magic 2.3.1 - install num 1.5-1 - install ocaml-compiler-libs v0.17.0 - install ocaml-syntax-shims 1.0.0 - install ocaml_intrinsics_kernel v0.17.1 - install ocamlbuild 0.16.1 - install ocamlfind 1.9.8 - install parsexp v0.17.0 - install ppx_assert v0.17.0 - install ppx_base v0.17.0 - install ppx_cold v0.17.0 - install ppx_compare v0.17.0 - install ppx_derivers 1.2.1 - install ppx_deriving 6.0.3 - install ppx_enumerate v0.17.0 - install ppx_expect v0.17.2 - install ppx_fields_conv v0.17.0 - install ppx_globalize v0.17.0 - install ppx_hash v0.17.0 - install ppx_here v0.17.0 - install ppx_inline_test v0.17.0 - install ppx_minidebug 2.1.0 - install ppx_optcomp v0.17.0 - install ppx_sexp_conv v0.17.0 - install ppx_string v0.17.0 - install ppx_variants_conv v0.17.0 - install ppxlib 0.35.0 - install ppxlib_jane v0.17.2 - install printbox 0.12 - install printbox-ext-plot 0.12 - install printbox-html 0.12 - install printbox-md 0.12 - install printbox-text 0.12 - install ptime 1.2.0 - install re 1.12.0 - install saturn_lockfree 0.5.0 - install seq base - install sexplib v0.17.0 - install sexplib0 v0.17.0 - install stdio v0.17.0 - install stdlib-shims 0.3.0 - install time_now v0.17.0 - install topkg 1.0.8 - install tyxml 4.6.0 - install uucp 16.0.0 - install uutf 1.0.4 - install variantslib v0.17.0 <><> Processing actions <><><><><><><><><><><><><><><><><><><><><><><><><><><><> -> retrieved backoff.0.1.1 (cached) -> retrieved angstrom.0.16.1 (cached) -> retrieved bigarray-compat.1.1.0 (cached) -> retrieved base.v0.17.1 (cached) -> retrieved bigstringaf.0.10.0 (cached) -> retrieved cppo.1.8.0 (cached) -> installed conf-pkg-config.4 -> retrieved csexp.1.5.2 (cached) -> retrieved ctypes.0.23.0, ctypes-foreign.0.23.0 (cached) -> installed conf-libffi.2.0.0 -> retrieved fieldslib.v0.17.0 (cached) -> retrieved integers.0.7.0 (cached) -> retrieved jane-street-headers.v0.17.0 (cached) -> retrieved jst-config.v0.17.0 (cached) -> retrieved mtime.2.1.0 (cached) -> retrieved multicore-magic.2.3.1 (cached) -> retrieved num.1.5-1 (cached) -> retrieved ocaml-compiler-libs.v0.17.0 (cached) -> retrieved ocaml-syntax-shims.1.0.0 (cached) -> retrieved ocaml_intrinsics_kernel.v0.17.1 (cached) -> retrieved ocamlbuild.0.16.1 (cached) -> retrieved ocamlfind.1.9.8 (cached) -> retrieved parsexp.v0.17.0 (cached) -> retrieved ppx_assert.v0.17.0 (cached) -> retrieved ppx_base.v0.17.0 (cached) -> retrieved ppx_cold.v0.17.0 (cached) -> retrieved dune.3.17.2, dune-configurator.3.17.2 (cached) -> installed num.1.5-1 -> retrieved ppx_compare.v0.17.0 (cached) -> retrieved ppx_derivers.1.2.1 (cached) -> retrieved ppx_deriving.6.0.3 (cached) -> retrieved ppx_enumerate.v0.17.0 (cached) -> retrieved ppx_expect.v0.17.2 (cached) -> retrieved ppx_fields_conv.v0.17.0 (cached) -> retrieved ppx_globalize.v0.17.0 (cached) -> retrieved ppx_hash.v0.17.0 (cached) -> retrieved ppx_here.v0.17.0 (cached) -> retrieved ppx_inline_test.v0.17.0 (cached) -> retrieved ppx_optcomp.v0.17.0 
(cached) -> retrieved ppx_sexp_conv.v0.17.0 (cached) -> retrieved ppx_string.v0.17.0 (cached) -> retrieved ppx_variants_conv.v0.17.0 (cached) -> retrieved ppxlib_jane.v0.17.2 (cached) -> retrieved ppx_minidebug.2.1.0 (cached) -> retrieved ptime.1.2.0 (cached) -> retrieved re.1.12.0 (cached) -> retrieved saturn_lockfree.0.5.0 (cached) -> retrieved seq.base (cached) -> installed seq.base -> retrieved ppxlib.0.35.0 (cached) -> retrieved printbox.0.12, printbox-ext-plot.0.12, printbox-html.0.12, printbox-md.0.12, printbox-text.0.12 (cached) -> retrieved sexplib.v0.17.0 (cached) -> retrieved sexplib0.v0.17.0 (cached) -> retrieved stdio.v0.17.0 (cached) -> retrieved stdlib-shims.0.3.0 (cached) -> retrieved time_now.v0.17.0 (cached) -> retrieved topkg.1.0.8 (cached) -> retrieved tyxml.4.6.0 (cached) -> retrieved uutf.1.0.4 (cached) -> retrieved variantslib.v0.17.0 (cached) -> retrieved uucp.16.0.0 (cached) -> installed ocamlbuild.0.16.1 -> installed ocamlfind.1.9.8 -> installed topkg.1.0.8 -> installed uutf.1.0.4 -> installed mtime.2.1.0 -> installed ptime.1.2.0 -> installed dune.3.17.2 -> installed jane-street-headers.v0.17.0 -> installed ppx_derivers.1.2.1 -> installed backoff.0.1.1 -> installed printbox.0.12 -> installed bigarray-compat.1.1.0 -> installed csexp.1.5.2 -> installed cppo.1.8.0 -> installed multicore-magic.2.3.1 -> installed ocaml-syntax-shims.1.0.0 -> installed ocaml-compiler-libs.v0.17.0 -> installed ocaml_intrinsics_kernel.v0.17.1 -> installed re.1.12.0 -> installed sexplib0.v0.17.0 -> installed stdlib-shims.0.3.0 -> installed saturn_lockfree.0.5.0 -> installed integers.0.7.0 -> installed parsexp.v0.17.0 -> installed dune-configurator.3.17.2 -> installed bigstringaf.0.10.0 -> installed sexplib.v0.17.0 -> installed angstrom.0.16.1 -> installed tyxml.4.6.0 -> installed uucp.16.0.0 -> installed printbox-html.0.12 -> installed printbox-text.0.12 -> installed printbox-md.0.12 -> installed printbox-ext-plot.0.12 -> installed ctypes.0.23.0 -> installed base.v0.17.1 -> installed ctypes-foreign.0.23.0 -> installed fieldslib.v0.17.0 -> installed variantslib.v0.17.0 -> installed stdio.v0.17.0 -> installed ppxlib.0.35.0 -> installed ppx_optcomp.v0.17.0 -> installed ppxlib_jane.v0.17.2 -> installed ppx_cold.v0.17.0 -> installed ppx_here.v0.17.0 -> installed ppx_variants_conv.v0.17.0 -> installed ppx_fields_conv.v0.17.0 -> installed ppx_enumerate.v0.17.0 -> installed ppx_globalize.v0.17.0 -> installed ppx_deriving.6.0.3 -> installed ppx_compare.v0.17.0 -> installed ppx_sexp_conv.v0.17.0 -> installed ppx_hash.v0.17.0 -> installed ppx_assert.v0.17.0 -> installed ppx_minidebug.2.1.0 -> installed ppx_base.v0.17.0 -> installed jst-config.v0.17.0 -> installed ppx_string.v0.17.0 -> installed time_now.v0.17.0 -> installed ppx_inline_test.v0.17.0 -> installed ppx_expect.v0.17.2 Done. # To update the current shell environment, run: eval $(opam env) 2025-03-21 15:22.32 ---> using "d9717d0a09483614d783e8e0c2ae1544a95a9829a9989513258fe2f9a483585c" from cache /src: (copy (src .) (dst /src)) 2025-03-21 15:22.32 ---> saved as "aafbb45b2422d07350c7260dbfe647559ad0c9082c6cb9f6db8911bbc1f9c83d" /src: (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build")) (cd _build/default/test_ppx && ./test_ppx_op_expected.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/default/test_ppx && ./test_ppx_op.exe) Welcome to OCANNL! 
Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition 'Found 0, in the config file' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition '' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition 'Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/1b8aac4f80bd683b1f67f25ed39b0912/default/test/ocannl_config.' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition 'Retrieving commandline, environment, or config file variable ocannl_log_level' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition primitive_ops.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition zero2hero_1of7.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition hello_world_op.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. 
Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition einsum_trivia.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition micrograd_demo.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition moons_demo_parallel.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/76727ac3d5dec06cd576c1eb3d92acad/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/default/test && ./moons_demo_parallel_run.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file ("Set log_level to" 1) └─{orphaned from #2} Retrieving commandline, environment, or config file variable ocannl_backend Found cc, in the config file Retrieving commandline, environment, or config file variable ocannl_ll_ident_style Not found, using default heuristic Retrieving commandline, environment, or config file variable ocannl_cc_backend_optimization_level Not found, using default 3 Retrieving commandline, environment, or config file variable ocannl_cc_backend_compiler_command Not found, using default gcc Retrieving commandline, environment, or config file variable ocannl_never_capture_stdout Not found, using default false Batch=59, step=60, lr=0.200000, batch loss=23.609453, epoch loss=23.609453 Batch=119, step=120, lr=0.199750, batch loss=8.539634, epoch loss=32.149087 Batch=179, step=180, lr=0.199500, batch loss=2.626295, epoch loss=34.775382 Batch=239, step=240, lr=0.199250, batch loss=0.849657, epoch loss=35.625039 Batch=299, step=300, lr=0.199000, batch loss=1.447177, epoch loss=37.072216 Batch=359, step=360, lr=0.198750, batch loss=1.329296, epoch loss=38.401512 Batch=419, step=420, lr=0.198250, batch loss=0.618569, epoch loss=39.020081 Batch=479, step=480, lr=0.198250, batch loss=0.821468, epoch loss=39.841549 Batch=539, step=540, lr=0.198000, batch loss=0.691338, epoch loss=40.532887 Batch=599, step=600, lr=0.197750, batch loss=1.063206, epoch loss=41.596093 Batch=659, step=660, lr=0.197500, batch loss=0.483478, epoch loss=42.079571 Batch=719, step=720, lr=0.197250, batch loss=0.411326, epoch loss=42.490898 Batch=779, step=780, lr=0.197000, batch loss=0.470075, epoch loss=42.960972 Batch=839, step=840, lr=0.196750, batch loss=0.446754, epoch loss=43.407726 Batch=899, step=900, lr=0.196500, batch loss=0.382734, epoch loss=43.790461 Batch=959, step=960, 
lr=0.196250, batch loss=0.244931, epoch loss=44.035392 Batch=1019, step=1020, lr=0.196000, batch loss=0.466979, epoch loss=44.502372 Batch=1079, step=1080, lr=0.195750, batch loss=0.248417, epoch loss=44.750788 Batch=1139, step=1140, lr=0.195500, batch loss=0.317817, epoch loss=45.068605 Batch=1199, step=1200, lr=0.195250, batch loss=0.263596, epoch loss=45.332201 Epoch=0, step=1200, lr=0.195250, epoch loss=45.332201 Batch=59, step=1260, lr=0.195000, batch loss=0.262061, epoch loss=0.262061 Batch=119, step=1320, lr=0.194750, batch loss=0.205387, epoch loss=0.467448 Batch=179, step=1380, lr=0.194500, batch loss=0.243651, epoch loss=0.711099 Batch=239, step=1440, lr=0.194250, batch loss=0.347762, epoch loss=1.058861 Batch=299, step=1500, lr=0.194000, batch loss=0.247603, epoch loss=1.306463 Batch=359, step=1560, lr=0.193750, batch loss=0.316389, epoch loss=1.622852 Batch=419, step=1620, lr=0.193500, batch loss=0.308846, epoch loss=1.931698 Batch=479, step=1680, lr=0.193250, batch loss=0.275052, epoch loss=2.206750 Batch=539, step=1740, lr=0.193000, batch loss=0.210887, epoch loss=2.417637 Batch=599, step=1800, lr=0.192750, batch loss=0.261849, epoch loss=2.679486 Batch=659, step=1860, lr=0.192500, batch loss=0.375706, epoch loss=3.055191 Batch=719, step=1920, lr=0.192250, batch loss=0.357921, epoch loss=3.413113 Batch=779, step=1980, lr=0.192000, batch loss=0.386996, epoch loss=3.800108 Batch=839, step=2040, lr=0.191750, batch loss=0.346378, epoch loss=4.146487 Batch=899, step=2100, lr=0.191500, batch loss=0.319125, epoch loss=4.465612 Batch=959, step=2160, lr=0.191250, batch loss=0.246311, epoch loss=4.711923 Batch=1019, step=2220, lr=0.191000, batch loss=0.377695, epoch loss=5.089618 Batch=1079, step=2280, lr=0.190750, batch loss=0.216504, epoch loss=5.306122 Batch=1139, step=2340, lr=0.190500, batch loss=0.265947, epoch loss=5.572069 Batch=1199, step=2400, lr=0.190250, batch loss=0.212202, epoch loss=5.784271 Epoch=1, step=2400, lr=0.190250, epoch loss=5.784271 Batch=59, step=2460, lr=0.190000, batch loss=0.233794, epoch loss=0.233794 Batch=119, step=2520, lr=0.189750, batch loss=0.195959, epoch loss=0.429753 Batch=179, step=2580, lr=0.189500, batch loss=0.221247, epoch loss=0.651000 Batch=239, step=2640, lr=0.189250, batch loss=0.329011, epoch loss=0.980010 Batch=299, step=2700, lr=0.189000, batch loss=0.207000, epoch loss=1.187011 Batch=359, step=2760, lr=0.188750, batch loss=0.292758, epoch loss=1.479768 Batch=419, step=2820, lr=0.188500, batch loss=0.281216, epoch loss=1.760984 Batch=479, step=2880, lr=0.188250, batch loss=0.255321, epoch loss=2.016305 Batch=539, step=2940, lr=0.188000, batch loss=0.196075, epoch loss=2.212380 Batch=599, step=3000, lr=0.187750, batch loss=0.240748, epoch loss=2.453128 Batch=659, step=3060, lr=0.187500, batch loss=0.348711, epoch loss=2.801839 Batch=719, step=3120, lr=0.187250, batch loss=0.347763, epoch loss=3.149602 Batch=779, step=3180, lr=0.187000, batch loss=0.364224, epoch loss=3.513826 Batch=839, step=3240, lr=0.186750, batch loss=0.324788, epoch loss=3.838614 Batch=899, step=3300, lr=0.186500, batch loss=0.289063, epoch loss=4.127677 Batch=959, step=3360, lr=0.186250, batch loss=0.211279, epoch loss=4.338956 Batch=1019, step=3420, lr=0.186000, batch loss=0.308415, epoch loss=4.647371 Batch=1079, step=3480, lr=0.185750, batch loss=0.174633, epoch loss=4.822004 Batch=1139, step=3540, lr=0.185500, batch loss=0.231638, epoch loss=5.053642 Batch=1199, step=3600, lr=0.185250, batch loss=0.199030, epoch loss=5.252672 Epoch=2, step=3600, lr=0.185250, 
epoch loss=5.252672 Batch=59, step=3660, lr=0.185000, batch loss=0.226799, epoch loss=0.226799 Batch=119, step=3720, lr=0.184750, batch loss=0.192050, epoch loss=0.418849 Batch=179, step=3780, lr=0.184500, batch loss=0.211058, epoch loss=0.629907 Batch=239, step=3840, lr=0.184250, batch loss=0.317851, epoch loss=0.947758 Batch=299, step=3900, lr=0.184000, batch loss=0.207365, epoch loss=1.155123 Batch=359, step=3960, lr=0.183750, batch loss=0.286537, epoch loss=1.441660 Batch=419, step=4020, lr=0.183500, batch loss=0.279989, epoch loss=1.721649 Batch=479, step=4080, lr=0.183250, batch loss=0.254150, epoch loss=1.975799 Batch=539, step=4140, lr=0.183000, batch loss=0.203195, epoch loss=2.178994 Batch=599, step=4200, lr=0.182750, batch loss=0.248914, epoch loss=2.427908 Batch=659, step=4260, lr=0.182500, batch loss=0.330545, epoch loss=2.758453 Batch=719, step=4320, lr=0.182250, batch loss=0.331395, epoch loss=3.089848 Batch=779, step=4380, lr=0.182000, batch loss=0.358263, epoch loss=3.448110 Batch=839, step=4440, lr=0.181750, batch loss=0.318436, epoch loss=3.766547 Batch=899, step=4500, lr=0.181500, batch loss=0.291240, epoch loss=4.057787 Batch=959, step=4560, lr=0.181250, batch loss=0.242941, epoch loss=4.300728 Batch=1019, step=4620, lr=0.181000, batch loss=0.339738, epoch loss=4.640466 Batch=1079, step=4680, lr=0.180500, batch loss=0.201131, epoch loss=4.841597 Batch=1139, step=4740, lr=0.180500, batch loss=0.243130, epoch loss=5.084726 Batch=1199, step=4800, lr=0.180250, batch loss=0.192633, epoch loss=5.277359 Epoch=3, step=4800, lr=0.180250, epoch loss=5.277359 Batch=59, step=4860, lr=0.180000, batch loss=0.228570, epoch loss=0.228570 Batch=119, step=4920, lr=0.179750, batch loss=0.188835, epoch loss=0.417405 Batch=179, step=4980, lr=0.179500, batch loss=0.205640, epoch loss=0.623044 Batch=239, step=5040, lr=0.179250, batch loss=0.308747, epoch loss=0.931791 Batch=299, step=5100, lr=0.179000, batch loss=0.204894, epoch loss=1.136685 Batch=359, step=5160, lr=0.178750, batch loss=0.271452, epoch loss=1.408137 Batch=419, step=5220, lr=0.178500, batch loss=0.265136, epoch loss=1.673273 Batch=479, step=5280, lr=0.178250, batch loss=0.241909, epoch loss=1.915182 Batch=539, step=5340, lr=0.178000, batch loss=0.194216, epoch loss=2.109398 Batch=599, step=5400, lr=0.177750, batch loss=0.237123, epoch loss=2.346521 Batch=659, step=5460, lr=0.177500, batch loss=0.320646, epoch loss=2.667167 Batch=719, step=5520, lr=0.177250, batch loss=0.323846, epoch loss=2.991013 Batch=779, step=5580, lr=0.177000, batch loss=0.339773, epoch loss=3.330785 Batch=839, step=5640, lr=0.176750, batch loss=0.311675, epoch loss=3.642461 Batch=899, step=5700, lr=0.176500, batch loss=0.271699, epoch loss=3.914160 Batch=959, step=5760, lr=0.176250, batch loss=0.216886, epoch loss=4.131046 Batch=1019, step=5820, lr=0.176000, batch loss=0.334636, epoch loss=4.465682 Batch=1079, step=5880, lr=0.175750, batch loss=0.191347, epoch loss=4.657029 Batch=1139, step=5940, lr=0.175500, batch loss=0.220400, epoch loss=4.877428 Batch=1199, step=6000, lr=0.175250, batch loss=0.186498, epoch loss=5.063927 Epoch=4, step=6000, lr=0.175250, epoch loss=5.063927 Batch=59, step=6060, lr=0.175000, batch loss=0.223713, epoch loss=0.223713 Batch=119, step=6120, lr=0.174750, batch loss=0.185306, epoch loss=0.409018 Batch=179, step=6180, lr=0.174500, batch loss=0.200503, epoch loss=0.609521 Batch=239, step=6240, lr=0.174250, batch loss=0.298531, epoch loss=0.908052 Batch=299, step=6300, lr=0.174000, batch loss=0.197043, epoch loss=1.105095 
Batch=359, step=6360, lr=0.173750, batch loss=0.265292, epoch loss=1.370387 Batch=419, step=6420, lr=0.173500, batch loss=0.260808, epoch loss=1.631196 Batch=479, step=6480, lr=0.173250, batch loss=0.240462, epoch loss=1.871657 Batch=539, step=6540, lr=0.173000, batch loss=0.195559, epoch loss=2.067217 Batch=599, step=6600, lr=0.172750, batch loss=0.231653, epoch loss=2.298870 Batch=659, step=6660, lr=0.172500, batch loss=0.313821, epoch loss=2.612691 Batch=719, step=6720, lr=0.172250, batch loss=0.316602, epoch loss=2.929292 Batch=779, step=6780, lr=0.172000, batch loss=0.329162, epoch loss=3.258455 Batch=839, step=6840, lr=0.171750, batch loss=0.306161, epoch loss=3.564615 Batch=899, step=6900, lr=0.171500, batch loss=0.269032, epoch loss=3.833648 Batch=959, step=6960, lr=0.171250, batch loss=0.208069, epoch loss=4.041717 Batch=1019, step=7020, lr=0.171000, batch loss=0.327026, epoch loss=4.368743 Batch=1079, step=7080, lr=0.170750, batch loss=0.174565, epoch loss=4.543309 Batch=1139, step=7140, lr=0.170500, batch loss=0.212505, epoch loss=4.755814 Batch=1199, step=7200, lr=0.170250, batch loss=0.181435, epoch loss=4.937249 Epoch=5, step=7200, lr=0.170250, epoch loss=4.937249 Batch=59, step=7260, lr=0.170000, batch loss=0.242580, epoch loss=0.242580 Batch=119, step=7320, lr=0.169750, batch loss=0.179804, epoch loss=0.422384 Batch=179, step=7380, lr=0.169500, batch loss=0.194063, epoch loss=0.616447 Batch=239, step=7440, lr=0.169250, batch loss=0.290669, epoch loss=0.907116 Batch=299, step=7500, lr=0.169000, batch loss=0.206197, epoch loss=1.113313 Batch=359, step=7560, lr=0.168750, batch loss=0.261619, epoch loss=1.374932 Batch=419, step=7620, lr=0.168500, batch loss=0.252593, epoch loss=1.627525 Batch=479, step=7680, lr=0.168250, batch loss=0.231605, epoch loss=1.859130 Batch=539, step=7740, lr=0.168000, batch loss=0.192073, epoch loss=2.051203 Batch=599, step=7800, lr=0.167750, batch loss=0.225691, epoch loss=2.276894 Batch=659, step=7860, lr=0.167500, batch loss=0.305121, epoch loss=2.582015 Batch=719, step=7920, lr=0.167250, batch loss=0.308402, epoch loss=2.890417 Batch=779, step=7980, lr=0.167000, batch loss=0.329933, epoch loss=3.220350 Batch=839, step=8040, lr=0.166750, batch loss=0.291245, epoch loss=3.511595 Batch=899, step=8100, lr=0.166500, batch loss=0.261553, epoch loss=3.773147 Batch=959, step=8160, lr=0.166250, batch loss=0.199462, epoch loss=3.972609 Batch=1019, step=8220, lr=0.166000, batch loss=0.330166, epoch loss=4.302774 Batch=1079, step=8280, lr=0.165750, batch loss=0.207160, epoch loss=4.509934 Batch=1139, step=8340, lr=0.165500, batch loss=0.220302, epoch loss=4.730236 Batch=1199, step=8400, lr=0.165250, batch loss=0.174432, epoch loss=4.904668 Epoch=6, step=8400, lr=0.165250, epoch loss=4.904668 Batch=59, step=8460, lr=0.165000, batch loss=0.212369, epoch loss=0.212369 Batch=119, step=8520, lr=0.164750, batch loss=0.177101, epoch loss=0.389470 Batch=179, step=8580, lr=0.164500, batch loss=0.188418, epoch loss=0.577889 Batch=239, step=8640, lr=0.164250, batch loss=0.275551, epoch loss=0.853440 Batch=299, step=8700, lr=0.164000, batch loss=0.189863, epoch loss=1.043303 Batch=359, step=8760, lr=0.163750, batch loss=0.247832, epoch loss=1.291135 Batch=419, step=8820, lr=0.163500, batch loss=0.244810, epoch loss=1.535945 Batch=479, step=8880, lr=0.163250, batch loss=0.226774, epoch loss=1.762718 Batch=539, step=8940, lr=0.163000, batch loss=0.178675, epoch loss=1.941393 Batch=599, step=9000, lr=0.162750, batch loss=0.215744, epoch loss=2.157137 Batch=659, step=9060, 
lr=0.162500, batch loss=0.294682, epoch loss=2.451819 Batch=719, step=9120, lr=0.162250, batch loss=0.293154, epoch loss=2.744973 Batch=779, step=9180, lr=0.162000, batch loss=0.315491, epoch loss=3.060464 Batch=839, step=9240, lr=0.161750, batch loss=0.285607, epoch loss=3.346071 Batch=899, step=9300, lr=0.161500, batch loss=0.251415, epoch loss=3.597486 Batch=959, step=9360, lr=0.161250, batch loss=0.187794, epoch loss=3.785281 Batch=1019, step=9420, lr=0.161000, batch loss=0.316483, epoch loss=4.101764 Batch=1079, step=9480, lr=0.160750, batch loss=0.185270, epoch loss=4.287034 Batch=1139, step=9540, lr=0.160500, batch loss=0.206393, epoch loss=4.493427 Batch=1199, step=9600, lr=0.160250, batch loss=0.165997, epoch loss=4.659424 Epoch=7, step=9600, lr=0.160250, epoch loss=4.659424 Batch=59, step=9660, lr=0.160000, batch loss=0.200247, epoch loss=0.200247 Batch=119, step=9720, lr=0.159750, batch loss=0.163412, epoch loss=0.363658 Batch=179, step=9780, lr=0.159500, batch loss=0.178134, epoch loss=0.541792 Batch=239, step=9840, lr=0.159250, batch loss=0.259581, epoch loss=0.801373 Batch=299, step=9900, lr=0.159000, batch loss=0.185665, epoch loss=0.987038 Batch=359, step=9960, lr=0.158750, batch loss=0.238343, epoch loss=1.225381 Batch=419, step=10020, lr=0.158500, batch loss=0.232099, epoch loss=1.457480 Batch=479, step=10080, lr=0.158250, batch loss=0.219159, epoch loss=1.676639 Batch=539, step=10140, lr=0.158000, batch loss=0.169056, epoch loss=1.845695 Batch=599, step=10200, lr=0.157750, batch loss=0.202377, epoch loss=2.048073 Batch=659, step=10260, lr=0.157500, batch loss=0.279669, epoch loss=2.327741 Batch=719, step=10320, lr=0.157250, batch loss=0.286915, epoch loss=2.614656 Batch=779, step=10380, lr=0.157000, batch loss=0.292067, epoch loss=2.906723 Batch=839, step=10440, lr=0.156750, batch loss=0.264949, epoch loss=3.171673 Batch=899, step=10500, lr=0.156500, batch loss=0.240937, epoch loss=3.412609 Batch=959, step=10560, lr=0.156250, batch loss=0.196287, epoch loss=3.608896 Batch=1019, step=10620, lr=0.156000, batch loss=0.277817, epoch loss=3.886713 Batch=1079, step=10680, lr=0.155750, batch loss=0.149454, epoch loss=4.036167 Batch=1139, step=10740, lr=0.155500, batch loss=0.179625, epoch loss=4.215792 Batch=1199, step=10800, lr=0.155250, batch loss=0.164071, epoch loss=4.379863 Epoch=8, step=10800, lr=0.155250, epoch loss=4.379863 Batch=59, step=10860, lr=0.155000, batch loss=0.190325, epoch loss=0.190325 Batch=119, step=10920, lr=0.154750, batch loss=0.163743, epoch loss=0.354068 Batch=179, step=10980, lr=0.154500, batch loss=0.164514, epoch loss=0.518582 Batch=239, step=11040, lr=0.154250, batch loss=0.240526, epoch loss=0.759108 Batch=299, step=11100, lr=0.154000, batch loss=0.168162, epoch loss=0.927270 Batch=359, step=11160, lr=0.153750, batch loss=0.218974, epoch loss=1.146244 Batch=419, step=11220, lr=0.153500, batch loss=0.217238, epoch loss=1.363481 Batch=479, step=11280, lr=0.153250, batch loss=0.209081, epoch loss=1.572563 Batch=539, step=11340, lr=0.153000, batch loss=0.164842, epoch loss=1.737405 Batch=599, step=11400, lr=0.152750, batch loss=0.178023, epoch loss=1.915428 Batch=659, step=11460, lr=0.152500, batch loss=0.264136, epoch loss=2.179565 Batch=719, step=11520, lr=0.152250, batch loss=0.255758, epoch loss=2.435323 Batch=779, step=11580, lr=0.152000, batch loss=0.270734, epoch loss=2.706056 Batch=839, step=11640, lr=0.151750, batch loss=0.254648, epoch loss=2.960705 Batch=899, step=11700, lr=0.151500, batch loss=0.211497, epoch loss=3.172202 Batch=959, 
step=11760, lr=0.151250, batch loss=0.166603, epoch loss=3.338804 Batch=1019, step=11820, lr=0.151000, batch loss=0.264806, epoch loss=3.603611 Batch=1079, step=11880, lr=0.150750, batch loss=0.144333, epoch loss=3.747944 Batch=1139, step=11940, lr=0.150500, batch loss=0.188264, epoch loss=3.936208 Batch=1199, step=12000, lr=0.150250, batch loss=0.137508, epoch loss=4.073716 Epoch=9, step=12000, lr=0.150250, epoch loss=4.073716 Batch=59, step=12060, lr=0.150000, batch loss=0.159449, epoch loss=0.159449 Batch=119, step=12120, lr=0.149750, batch loss=0.135504, epoch loss=0.294953 Batch=179, step=12180, lr=0.149500, batch loss=0.149628, epoch loss=0.444581 Batch=239, step=12240, lr=0.149250, batch loss=0.216790, epoch loss=0.661371 Batch=299, step=12300, lr=0.149000, batch loss=0.139648, epoch loss=0.801019 Batch=359, step=12360, lr=0.148750, batch loss=0.195931, epoch loss=0.996949 Batch=419, step=12420, lr=0.148500, batch loss=0.204192, epoch loss=1.201141 Batch=479, step=12480, lr=0.148250, batch loss=0.176490, epoch loss=1.377631 Batch=539, step=12540, lr=0.148000, batch loss=0.140732, epoch loss=1.518363 Batch=599, step=12600, lr=0.147750, batch loss=0.146240, epoch loss=1.664603 Batch=659, step=12660, lr=0.147500, batch loss=0.219503, epoch loss=1.884106 Batch=719, step=12720, lr=0.147250, batch loss=0.230481, epoch loss=2.114587 Batch=779, step=12780, lr=0.147000, batch loss=0.245806, epoch loss=2.360393 Batch=839, step=12840, lr=0.146750, batch loss=0.225714, epoch loss=2.586107 Batch=899, step=12900, lr=0.146500, batch loss=0.184875, epoch loss=2.770981 Batch=959, step=12960, lr=0.146250, batch loss=0.167398, epoch loss=2.938379 Batch=1019, step=13020, lr=0.146000, batch loss=0.283656, epoch loss=3.222035 Batch=1079, step=13080, lr=0.145750, batch loss=0.109833, epoch loss=3.331868 Batch=1139, step=13140, lr=0.145500, batch loss=0.153405, epoch loss=3.485272 Batch=1199, step=13200, lr=0.145250, batch loss=0.116101, epoch loss=3.601373 Epoch=10, step=13200, lr=0.145250, epoch loss=3.601373 Batch=59, step=13260, lr=0.145000, batch loss=0.141073, epoch loss=0.141073 Batch=119, step=13320, lr=0.144750, batch loss=0.117729, epoch loss=0.258803 Batch=179, step=13380, lr=0.144500, batch loss=0.125738, epoch loss=0.384541 Batch=239, step=13440, lr=0.144250, batch loss=0.185351, epoch loss=0.569892 Batch=299, step=13500, lr=0.144000, batch loss=0.111038, epoch loss=0.680931 Batch=359, step=13560, lr=0.143750, batch loss=0.157802, epoch loss=0.838733 Batch=419, step=13620, lr=0.143500, batch loss=0.157072, epoch loss=0.995805 Batch=479, step=13680, lr=0.143250, batch loss=0.142811, epoch loss=1.138616 Batch=539, step=13740, lr=0.143000, batch loss=0.115148, epoch loss=1.253764 Batch=599, step=13800, lr=0.142750, batch loss=0.117133, epoch loss=1.370897 Batch=659, step=13860, lr=0.142500, batch loss=0.171428, epoch loss=1.542325 Batch=719, step=13920, lr=0.142250, batch loss=0.168961, epoch loss=1.711286 Batch=779, step=13980, lr=0.142000, batch loss=0.193262, epoch loss=1.904548 Batch=839, step=14040, lr=0.141750, batch loss=0.200657, epoch loss=2.105205 Batch=899, step=14100, lr=0.141500, batch loss=0.221730, epoch loss=2.326935 Batch=959, step=14160, lr=0.141250, batch loss=0.096027, epoch loss=2.422962 Batch=1019, step=14220, lr=0.141000, batch loss=0.199086, epoch loss=2.622048 Batch=1079, step=14280, lr=0.140750, batch loss=0.072677, epoch loss=2.694725 Batch=1139, step=14340, lr=0.140500, batch loss=0.109859, epoch loss=2.804585 Batch=1199, step=14400, lr=0.140250, batch loss=0.082220, 
epoch loss=2.886805 Epoch=11, step=14400, lr=0.140250, epoch loss=2.886805 Batch=59, step=14460, lr=0.140000, batch loss=0.102922, epoch loss=0.102922 Batch=119, step=14520, lr=0.139750, batch loss=0.103756, epoch loss=0.206678 Batch=179, step=14580, lr=0.139500, batch loss=0.096499, epoch loss=0.303177 Batch=239, step=14640, lr=0.139250, batch loss=0.137490, epoch loss=0.440667 Batch=299, step=14700, lr=0.139000, batch loss=0.072094, epoch loss=0.512761 Batch=359, step=14760, lr=0.138750, batch loss=0.119017, epoch loss=0.631778 Batch=419, step=14820, lr=0.138500, batch loss=0.133508, epoch loss=0.765286 Batch=479, step=14880, lr=0.138250, batch loss=0.089271, epoch loss=0.854557 Batch=539, step=14940, lr=0.138000, batch loss=0.087736, epoch loss=0.942293 Batch=599, step=15000, lr=0.137750, batch loss=0.080728, epoch loss=1.023022 Batch=659, step=15060, lr=0.137500, batch loss=0.131589, epoch loss=1.154611 Batch=719, step=15120, lr=0.137250, batch loss=0.172677, epoch loss=1.327287 Batch=779, step=15180, lr=0.137000, batch loss=0.371090, epoch loss=1.698378 Batch=839, step=15240, lr=0.136750, batch loss=0.128913, epoch loss=1.827291 Batch=899, step=15300, lr=0.136500, batch loss=0.103782, epoch loss=1.931072 Batch=959, step=15360, lr=0.136250, batch loss=0.074626, epoch loss=2.005698 Batch=1019, step=15420, lr=0.136000, batch loss=0.161017, epoch loss=2.166715 Batch=1079, step=15480, lr=0.135750, batch loss=0.042971, epoch loss=2.209686 Batch=1139, step=15540, lr=0.135500, batch loss=0.093455, epoch loss=2.303141 Batch=1199, step=15600, lr=0.135250, batch loss=0.056530, epoch loss=2.359671 Epoch=12, step=15600, lr=0.135250, epoch loss=2.359671 Batch=59, step=15660, lr=0.135000, batch loss=0.073330, epoch loss=0.073330 Batch=119, step=15720, lr=0.134750, batch loss=0.119745, epoch loss=0.193075 Batch=179, step=15780, lr=0.134500, batch loss=0.090547, epoch loss=0.283622 Batch=239, step=15840, lr=0.134250, batch loss=0.091723, epoch loss=0.375345 Batch=299, step=15900, lr=0.134000, batch loss=0.037500, epoch loss=0.412846 Batch=359, step=15960, lr=0.133750, batch loss=0.081203, epoch loss=0.494049 Batch=419, step=16020, lr=0.133500, batch loss=0.075336, epoch loss=0.569385 Batch=479, step=16080, lr=0.133250, batch loss=0.053611, epoch loss=0.622997 Batch=539, step=16140, lr=0.133000, batch loss=0.072248, epoch loss=0.695245 Batch=599, step=16200, lr=0.132750, batch loss=0.175436, epoch loss=0.870681 Batch=659, step=16260, lr=0.132500, batch loss=0.096037, epoch loss=0.966718 Batch=719, step=16320, lr=0.132250, batch loss=0.115693, epoch loss=1.082411 Batch=779, step=16380, lr=0.132000, batch loss=0.276526, epoch loss=1.358937 Batch=839, step=16440, lr=0.131750, batch loss=0.088782, epoch loss=1.447719 Batch=899, step=16500, lr=0.131500, batch loss=0.078123, epoch loss=1.525842 Batch=959, step=16560, lr=0.131250, batch loss=0.029896, epoch loss=1.555738 Batch=1019, step=16620, lr=0.131000, batch loss=0.058185, epoch loss=1.613923 Batch=1079, step=16680, lr=0.130750, batch loss=0.055546, epoch loss=1.669469 Batch=1139, step=16740, lr=0.130500, batch loss=0.103958, epoch loss=1.773427 Batch=1199, step=16800, lr=0.130250, batch loss=0.047735, epoch loss=1.821162 Epoch=13, step=16800, lr=0.130250, epoch loss=1.821162 Batch=59, step=16860, lr=0.130000, batch loss=0.033291, epoch loss=0.033291 Batch=119, step=16920, lr=0.129750, batch loss=0.031963, epoch loss=0.065254 Batch=179, step=16980, lr=0.129500, batch loss=0.042304, epoch loss=0.107558 Batch=239, step=17040, lr=0.129250, batch 
loss=0.057542, epoch loss=0.165100
Batch=299, step=17100, lr=0.129000, batch loss=0.019380, epoch loss=0.184480
Batch=359, step=17160, lr=0.128750, batch loss=0.045574, epoch loss=0.230054
Batch=419, step=17220, lr=0.128500, batch loss=0.081708, epoch loss=0.311763
Batch=479, step=17280, lr=0.128250, batch loss=0.024507, epoch loss=0.336270
Batch=539, step=17340, lr=0.128000, batch loss=0.032244, epoch loss=0.368513
Batch=599, step=17400, lr=0.127750, batch loss=0.035379, epoch loss=0.403892
Batch=659, step=17460, lr=0.127500, batch loss=0.046261, epoch loss=0.450152
Batch=719, step=17520, lr=0.127250, batch loss=0.046007, epoch loss=0.496159
Batch=779, step=17580, lr=0.127000, batch loss=0.105058, epoch loss=0.601217
Batch=839, step=17640, lr=0.126750, batch loss=0.099621, epoch loss=0.700838
Batch=899, step=17700, lr=0.126500, batch loss=0.150032, epoch loss=0.850870
Batch=959, step=17760, lr=0.126250, batch loss=0.020837, epoch loss=0.871707
Batch=1019, step=17820, lr=0.126000, batch loss=0.035012, epoch loss=0.906719
Batch=1079, step=17880, lr=0.125750, batch loss=0.009941, epoch loss=0.916660
Batch=1139, step=17940, lr=0.125500, batch loss=0.028550, epoch loss=0.945210
Batch=1199, step=18000, lr=0.125250, batch loss=0.013127, epoch loss=0.958337
Epoch=14, step=18000, lr=0.125250, epoch loss=0.958337
Batch=59, step=18060, lr=0.125000, batch loss=0.010784, epoch loss=0.010784
Batch=119, step=18120, lr=0.124750, batch loss=0.017476, epoch loss=0.028260
Batch=179, step=18180, lr=0.124500, batch loss=0.028141, epoch loss=0.056401
Batch=239, step=18240, lr=0.124250, batch loss=0.035344, epoch loss=0.091746
Batch=299, step=18300, lr=0.124000, batch loss=0.018227, epoch loss=0.109973
Batch=359, step=18360, lr=0.123750, batch loss=0.022453, epoch loss=0.132426
Batch=419, step=18420, lr=0.123500, batch loss=0.029983, epoch loss=0.162410
Batch=479, step=18480, lr=0.123250, batch loss=0.017020, epoch loss=0.179430
Batch=539, step=18540, lr=0.123000, batch loss=0.025844, epoch loss=0.205274
Batch=599, step=18600, lr=0.122750, batch loss=0.025334, epoch loss=0.230608
Batch=659, step=18660, lr=0.122500, batch loss=0.029169, epoch loss=0.259777
Batch=719, step=18720, lr=0.122250, batch loss=0.023378, epoch loss=0.283155
Batch=779, step=18780, lr=0.122000, batch loss=0.039546, epoch loss=0.322702
Batch=839, step=18840, lr=0.121750, batch loss=0.068082, epoch loss=0.390783
Batch=899, step=18900, lr=0.121500, batch loss=0.035385, epoch loss=0.426168
Batch=959, step=18960, lr=0.121250, batch loss=0.013250, epoch loss=0.439418
Batch=1019, step=19020, lr=0.121000, batch loss=0.022145, epoch loss=0.461562
Batch=1079, step=19080, lr=0.120750, batch loss=0.012607, epoch loss=0.474169
Batch=1139, step=19140, lr=0.120500, batch loss=0.022514, epoch loss=0.496683
Batch=1199, step=19200, lr=0.120250, batch loss=0.009672, epoch loss=0.506355
Epoch=15, step=19200, lr=0.120250, epoch loss=0.506355
Batch=59, step=19260, lr=0.120000, batch loss=0.005842, epoch loss=0.005842
Batch=119, step=19320, lr=0.119750, batch loss=0.016653, epoch loss=0.022495
Batch=179, step=19380, lr=0.119500, batch loss=0.049700, epoch loss=0.072195
Batch=239, step=19440, lr=0.119250, batch loss=0.019469, epoch loss=0.091664
Batch=299, step=19500, lr=0.119000, batch loss=0.004995, epoch loss=0.096659
Batch=359, step=19560, lr=0.118750, batch loss=0.017207, epoch loss=0.113867
Batch=419, step=19620, lr=0.118500, batch loss=0.019417, epoch loss=0.133284
Batch=479, step=19680, lr=0.118250, batch loss=0.008380, epoch loss=0.141663
Batch=539, step=19740, lr=0.118000, batch loss=0.017146, epoch loss=0.158810
Batch=599, step=19800, lr=0.117750, batch loss=0.022279, epoch loss=0.181089
Batch=659, step=19860, lr=0.117500, batch loss=0.017757, epoch loss=0.198846
Batch=719, step=19920, lr=0.117250, batch loss=0.041895, epoch loss=0.240740
Batch=779, step=19980, lr=0.117000, batch loss=0.088319, epoch loss=0.329060
Batch=839, step=20040, lr=0.116750, batch loss=0.031715, epoch loss=0.360775
Batch=899, step=20100, lr=0.116500, batch loss=0.027422, epoch loss=0.388196
Batch=959, step=20160, lr=0.116250, batch loss=0.017591, epoch loss=0.405788
Batch=1019, step=20220, lr=0.116000, batch loss=0.029250, epoch loss=0.435038
Batch=1079, step=20280, lr=0.115750, batch loss=0.002570, epoch loss=0.437608
Batch=1139, step=20340, lr=0.115500, batch loss=0.014645, epoch loss=0.452253
Batch=1199, step=20400, lr=0.115250, batch loss=0.006350, epoch loss=0.458604
Epoch=16, step=20400, lr=0.115250, epoch loss=0.458604
Batch=59, step=20460, lr=0.115000, batch loss=0.004048, epoch loss=0.004048
Batch=119, step=20520, lr=0.114750, batch loss=0.009331, epoch loss=0.013378
Batch=179, step=20580, lr=0.114500, batch loss=0.018911, epoch loss=0.032289
Batch=239, step=20640, lr=0.114250, batch loss=0.021209, epoch loss=0.053498
Batch=299, step=20700, lr=0.114000, batch loss=0.008384, epoch loss=0.061882
Batch=359, step=20760, lr=0.113750, batch loss=0.013997, epoch loss=0.075879
Batch=419, step=20820, lr=0.113500, batch loss=0.013551, epoch loss=0.089430
Batch=479, step=20880, lr=0.113250, batch loss=0.003488, epoch loss=0.092918
Batch=539, step=20940, lr=0.113000, batch loss=0.019903, epoch loss=0.112820
Batch=599, step=21000, lr=0.112750, batch loss=0.019266, epoch loss=0.132087
Batch=659, step=21060, lr=0.112500, batch loss=0.015215, epoch loss=0.147301
Batch=719, step=21120, lr=0.112250, batch loss=0.041217, epoch loss=0.188518
Batch=779, step=21180, lr=0.112000, batch loss=0.064319, epoch loss=0.252837
Batch=839, step=21240, lr=0.111750, batch loss=0.023954, epoch loss=0.276791
Batch=899, step=21300, lr=0.111500, batch loss=0.028625, epoch loss=0.305416
Batch=959, step=21360, lr=0.111250, batch loss=0.010154, epoch loss=0.315570
Batch=1019, step=21420, lr=0.111000, batch loss=0.010823, epoch loss=0.326393
Batch=1079, step=21480, lr=0.110750, batch loss=0.001444, epoch loss=0.327837
Batch=1139, step=21540, lr=0.110500, batch loss=0.012791, epoch loss=0.340628
Batch=1199, step=21600, lr=0.110250, batch loss=0.005194, epoch loss=0.345822
Epoch=17, step=21600, lr=0.110250, epoch loss=0.345822
Batch=59, step=21660, lr=0.110000, batch loss=0.002827, epoch loss=0.002827
Batch=119, step=21720, lr=0.109750, batch loss=0.006395, epoch loss=0.009221
Batch=179, step=21780, lr=0.109500, batch loss=0.012592, epoch loss=0.021813
Batch=239, step=21840, lr=0.109250, batch loss=0.009376, epoch loss=0.031189
Batch=299, step=21900, lr=0.109000, batch loss=0.004070, epoch loss=0.035259
Batch=359, step=21960, lr=0.108750, batch loss=0.014130, epoch loss=0.049389
Batch=419, step=22020, lr=0.108500, batch loss=0.012032, epoch loss=0.061421
Batch=479, step=22080, lr=0.108250, batch loss=0.003251, epoch loss=0.064672
Batch=539, step=22140, lr=0.108000, batch loss=0.017895, epoch loss=0.082568
Batch=599, step=22200, lr=0.107750, batch loss=0.016934, epoch loss=0.099502
Batch=659, step=22260, lr=0.107500, batch loss=0.015584, epoch loss=0.115086
Batch=719, step=22320, lr=0.107250, batch loss=0.023284, epoch loss=0.138370
Batch=779, step=22380, lr=0.107000, batch loss=0.037606, epoch loss=0.175976
Batch=839, step=22440, lr=0.106750, batch loss=0.023106, epoch loss=0.199082
Batch=899, step=22500, lr=0.106500, batch loss=0.029584, epoch loss=0.228666
Batch=959, step=22560, lr=0.106250, batch loss=0.009165, epoch loss=0.237831
Batch=1019, step=22620, lr=0.106000, batch loss=0.009336, epoch loss=0.247168
Batch=1079, step=22680, lr=0.105750, batch loss=0.000889, epoch loss=0.248057
Batch=1139, step=22740, lr=0.105500, batch loss=0.011202, epoch loss=0.259258
Batch=1199, step=22800, lr=0.105250, batch loss=0.004550, epoch loss=0.263808
Epoch=18, step=22800, lr=0.105250, epoch loss=0.263808
Batch=59, step=22860, lr=0.105000, batch loss=0.001677, epoch loss=0.001677
Batch=119, step=22920, lr=0.104750, batch loss=0.005427, epoch loss=0.007104
Batch=179, step=22980, lr=0.104500, batch loss=0.010929, epoch loss=0.018034
Batch=239, step=23040, lr=0.104250, batch loss=0.008830, epoch loss=0.026863
Batch=299, step=23100, lr=0.104000, batch loss=0.005730, epoch loss=0.032593
Batch=359, step=23160, lr=0.103750, batch loss=0.010586, epoch loss=0.043179
Batch=419, step=23220, lr=0.103500, batch loss=0.010180, epoch loss=0.053359
Batch=479, step=23280, lr=0.103250, batch loss=0.002709, epoch loss=0.056068
Batch=539, step=23340, lr=0.103000, batch loss=0.017192, epoch loss=0.073261
Batch=599, step=23400, lr=0.102750, batch loss=0.013379, epoch loss=0.086640
Batch=659, step=23460, lr=0.102500, batch loss=0.010908, epoch loss=0.097548
Batch=719, step=23520, lr=0.102250, batch loss=0.013902, epoch loss=0.111450
Batch=779, step=23580, lr=0.102000, batch loss=0.022112, epoch loss=0.133563
Batch=839, step=23640, lr=0.101750, batch loss=0.026491, epoch loss=0.160053
Batch=899, step=23700, lr=0.101500, batch loss=0.021303, epoch loss=0.181356
Batch=959, step=23760, lr=0.101250, batch loss=0.007741, epoch loss=0.189097
Batch=1019, step=23820, lr=0.101000, batch loss=0.006419, epoch loss=0.195516
Batch=1079, step=23880, lr=0.100750, batch loss=0.002075, epoch loss=0.197591
Batch=1139, step=23940, lr=0.100500, batch loss=0.008305, epoch loss=0.205896
Batch=1199, step=24000, lr=0.100250, batch loss=0.004436, epoch loss=0.210332
Epoch=19, step=24000, lr=0.100250, epoch loss=0.210332
Half-moons scatterplot and decision boundary:
[ASCII plot: '#' and '%' mark the two half-moon point classes; '*' and '.' shade the regions on either side of the learned decision boundary]
2025-03-21 15:23.00 ---> saved as "4d503d4bdc1d9569a7288d0cda34eb4878abe47da459860d3061feaeb5fa6902"
Job succeeded
2025-03-21 15:23.01: Job succeeded