2025-03-20 22:51.40: New job: test ahrefs/ocannl https://github.com/ahrefs/ocannl.git#refs/heads/master (451f36eeac30ee583b682d08902494fd1a3a4f1b) (linux-x86_64:debian-12-5.3+flambda_opam-2.3)
Base: ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96

Opam project build

To reproduce locally:

git clone --recursive "https://github.com/ahrefs/ocannl.git" -b "master" && cd "ocannl" && git reset --hard 451f36ee
cat > Dockerfile <<'END-OF-DOCKERFILE'
FROM ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96
# debian-12-5.3+flambda_opam-2.3
USER 1000:1000
ENV CLICOLOR_FORCE="1"
ENV OPAMCOLOR="always"
WORKDIR /src
RUN sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam
RUN opam init --reinit -ni
RUN uname -rs && opam exec -- ocaml -version && opam --version
WORKDIR /src
RUN sudo chown opam /src
RUN cd ~/opam-repository && (git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master) && git reset -q --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 && git log --no-decorate -n1 --oneline && opam update -u
COPY --chown=1000:1000 neural_nets_lib.opam arrayjit.opam ./
RUN opam pin add -yn neural_nets_lib.dev './' && \
    opam pin add -yn arrayjit.dev './'
RUN echo '(lang dune 3.0)' > './dune-project'
ENV DEPS="angstrom.0.16.1 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.17.2 dune-configurator.3.17.2 fieldslib.v0.17.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-syntax-shims.1.0.0 ocaml-variants.5.3.0+options ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.1.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0"
ENV CI="true"
ENV OCAMLCI="true"
RUN opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS
RUN opam install $DEPS
COPY --chown=1000:1000 . /src
RUN opam exec -- dune build @install @check @runtest && rm -rf _build
END-OF-DOCKERFILE
docker build .
END-REPRO-BLOCK 2025-03-20 22:51.40: Using cache hint "ahrefs/ocannl-ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96-debian-12-5.3+flambda_opam-2.3-14a85f4c565cc30186c137b219fc7fa2" 2025-03-20 22:51.40: Using OBuilder spec: ((from ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96) (comment debian-12-5.3+flambda_opam-2.3) (user (uid 1000) (gid 1000)) (env CLICOLOR_FORCE 1) (env OPAMCOLOR always) (workdir /src) (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam")) (run (shell "opam init --reinit -ni")) (run (shell "uname -rs && opam exec -- ocaml -version && opam --version")) (workdir /src) (run (shell "sudo chown opam /src")) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master) && git reset -q --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 && git log --no-decorate -n1 --oneline && opam update -u")) (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./)) (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'")) (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'")) (env DEPS "angstrom.0.16.1 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.17.2 dune-configurator.3.17.2 fieldslib.v0.17.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-syntax-shims.1.0.0 ocaml-variants.5.3.0+options ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.1.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0") (env CI true) (env OCAMLCI true) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS")) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS")) (copy (src .) 
(dst /src))
 (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build"))
)

2025-03-20 22:51.40: Waiting for resource in pool OCluster
2025-03-20 22:51.40: Waiting for worker…
2025-03-20 22:51.42: Got resource from pool OCluster
Building on asteria.caelum.ci.dev
HEAD is now at 1a4d0ebb fPIC for cc: but only openSUSE complained https://ocaml.ci.dev/github/ahrefs/ocannl/commit/ccaf459c55f1e1dab014a65af54e1ba2ec3b9ad0/variant/opensuse-15.6-5.3_opam-2.3
HEAD is now at 451f36ee Add new configuration options for diffing runs debug settings

(from ocaml/opam:debian-12-ocaml-5.3-flambda@sha256:36b0564f1768719da7bdf896155176e645f1eed290256ed08bf88aedbed5db96)
2025-03-20 22:51.43 ---> using "0cceac30ed9dfa8d54c8dfb703526aecc2d1f25e09755ea19f6a9b3ce08944d1" from cache

/: (comment debian-12-5.3+flambda_opam-2.3)
/: (user (uid 1000) (gid 1000))
/: (env CLICOLOR_FORCE 1)
/: (env OPAMCOLOR always)
/: (workdir /src)
/src: (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam"))
2025-03-20 22:51.43 ---> using "b6cc72d0b69338afee388438c163da01e1509537d9db800516f8d6f84e0ff0f0" from cache

/src: (run (shell "opam init --reinit -ni"))
Configuring from /home/opam/.opamrc and then from built-in defaults.
Checking for available remotes: rsync and local, git.
  - you won't be able to use mercurial repositories unless you install the hg command on your system.
  - you won't be able to use darcs repositories unless you install the darcs command on your system.

This development version of opam requires an update to the layout of /home/opam/.opam from version 2.0 to version 2.2, which can't be reverted.
You may want to back it up before going further.

Continue? [y/n] y
[NOTE] The 'jobs' option was reset, its value was 39 and its new value will vary according to the current number of cores on your machine. You can restore the fixed value using:
    opam option jobs=39 --global
Format upgrade done.
<><> Updating repositories ><><><><><><><><><><><><><><><><><><><><><><><><><><>
[ERROR] Could not update repository "opam-repository-archive": "/usr/bin/git fetch -q" exited with code 128 "fatal: unable to access 'https://github.com/ocaml/opam-repository-archive/': Could not resolve host: github.com"
[default] synchronised from file:///home/opam/opam-repository
2025-03-20 22:51.43 ---> using "9579671be2547253d961834dbf99a2617c3043d50341e16980a14c2b7946d157" from cache

/src: (run (shell "uname -rs && opam exec -- ocaml -version && opam --version"))
Linux 5.15.0-134-generic
The OCaml toplevel, version 5.3.0
2.3.0
2025-03-20 22:51.43 ---> using "21f2427316ecd3b2b1d06245ccc85b94bbe86b75b44d925374a8e2c678f4916d" from cache

/src: (workdir /src)
/src: (run (shell "sudo chown opam /src"))
2025-03-20 22:51.43 ---> using "a9c2b183ae9ffb50c46a6bddc4bdbd4bc2a49a6d4fa95d31a0495ab1d51f7cf3" from cache

/src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master) && git reset -q --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 && git log --no-decorate -n1 --oneline && opam update -u"))
From https://github.com/ocaml/opam-repository
 * branch                  master     -> FETCH_HEAD
   862a7640b1..6cf83229dd  master     -> origin/master
4e25d0cf5f Merge pull request #27651 from lukstafi/opam-publish-ppx_minidebug.2.1.0

<><> Updating package repositories ><><><><><><><><><><><><><><><><><><><><><><>
[opam-repository-archive] synchronised from git+https://github.com/ocaml/opam-repository-archive
[default] synchronised from file:///home/opam/opam-repository
Everything as up-to-date as possible (run with --verbose to show unavailable upgrades).
However, you may "opam upgrade" these packages explicitly, which will ask permission to downgrade or uninstall the conflicting packages.
Nothing to do.
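The step above pins the default opam repository to commit 4e25d0cf5f before any solving happens. To mirror that repository state outside this image, a minimal sketch, assuming your default opam remote is likewise backed by a local clone at ~/opam-repository (as it is here):

    cd ~/opam-repository
    # fetch only if the pinned commit is not already present, as the job does
    git cat-file -e 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785 || git fetch origin master
    git reset --hard 4e25d0cf5f38cf58d1904bbb48f62ccd9c48f785
    opam update -u

This keeps the solver input identical to the CI run.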
# To update the current shell environment, run: eval $(opam env) 2025-03-20 22:51.43 ---> using "af53f33f5b819debc22b733209c9b4785d8a42ec3a79f8a616d660e6188b4b5a" from cache /src: (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./)) 2025-03-20 22:51.43 ---> using "b323d96ffeacb25bfe6adb5928a31ac9ec899d21475a20216b0092dd5e664a7b" from cache /src: (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'")) [neural_nets_lib.dev] synchronised (file:///src) neural_nets_lib is now pinned to file:///src (version dev) [arrayjit.dev] synchronised (file:///src) arrayjit is now pinned to file:///src (version dev) 2025-03-20 22:51.43 ---> using "5f054a69009203df3f20573d4db5ccf09ae607dbf7085fcf65509d3b4f430b0c" from cache /src: (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'")) 2025-03-20 22:51.43 ---> using "bb5d4cd51cb16649a0b1a07c13f42656728e057e546d3328dcdcc17c1e452017" from cache /src: (env DEPS "angstrom.0.16.1 backoff.0.1.1 base.v0.17.1 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.17.2 dune-configurator.3.17.2 fieldslib.v0.17.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-syntax-shims.1.0.0 ocaml-variants.5.3.0+options ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.1.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0") /src: (env CI true) /src: (env OCAMLCI true) /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS")) + /usr/bin/sudo "apt-get" "update" - Get:1 http://deb.debian.org/debian bookworm InRelease [151 kB] - Get:2 http://deb.debian.org/debian bookworm-updates InRelease [55.4 kB] - Get:3 http://deb.debian.org/debian-security bookworm-security InRelease [48.0 kB] - Get:4 http://deb.debian.org/debian bookworm/main amd64 Packages [8792 kB] - Get:5 http://deb.debian.org/debian-security bookworm-security/main amd64 Packages [249 kB] - Fetched 9296 kB in 1s (7064 kB/s) - Reading package lists... <><> Synchronising pinned packages ><><><><><><><><><><><><><><><><><><><><><><> [arrayjit.dev] synchronised (file:///src) [neural_nets_lib.dev] synchronised (file:///src) [NOTE] Package ocaml-variants is already installed (current version is 5.3.0+options). [NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). 
[NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). [NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). [NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). The following system packages will first need to be installed: libffi-dev pkg-config <><> Handling external dependencies <><><><><><><><><><><><><><><><><><><><><><> + /usr/bin/sudo "apt-get" "install" "-qq" "-yy" "libffi-dev" "pkg-config" - debconf: delaying package configuration, since apt-utils is not installed - Selecting previously unselected package libffi-dev:amd64. - (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 18776 files and directories currently installed.) - Preparing to unpack .../libffi-dev_3.4.4-1_amd64.deb ... - Unpacking libffi-dev:amd64 (3.4.4-1) ... - Selecting previously unselected package libpkgconf3:amd64. - Preparing to unpack .../libpkgconf3_1.8.1-1_amd64.deb ... - Unpacking libpkgconf3:amd64 (1.8.1-1) ... - Selecting previously unselected package pkgconf-bin. - Preparing to unpack .../pkgconf-bin_1.8.1-1_amd64.deb ... - Unpacking pkgconf-bin (1.8.1-1) ... - Selecting previously unselected package pkgconf:amd64. - Preparing to unpack .../pkgconf_1.8.1-1_amd64.deb ... - Unpacking pkgconf:amd64 (1.8.1-1) ... - Selecting previously unselected package pkg-config:amd64. - Preparing to unpack .../pkg-config_1.8.1-1_amd64.deb ... - Unpacking pkg-config:amd64 (1.8.1-1) ... - Setting up libffi-dev:amd64 (3.4.4-1) ... - Setting up libpkgconf3:amd64 (1.8.1-1) ... - Setting up pkgconf-bin (1.8.1-1) ... - Setting up pkgconf:amd64 (1.8.1-1) ... - Setting up pkg-config:amd64 (1.8.1-1) ... - Processing triggers for libc-bin (2.36-9+deb12u9) ... 2025-03-20 22:51.43 ---> using "8238c45edf3cd1d100506951c4dcace83467710698b325f4e5379b732f1032f5" from cache /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS")) [NOTE] Package ocaml-variants is already installed (current version is 5.3.0+options). [NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). [NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). [NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). [NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). 
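At this point the job has already installed the required system packages (libffi-dev, pkg-config) via opam's depext integration, and the action plan for the OCaml dependencies follows below. Outside Docker, the equivalent two stages are roughly (commands taken from this log; $DEPS as defined above):

    opam update --depexts
    opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS   # pulls in libffi-dev and pkg-config via apt-get
    opam install $DEPS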
The following actions will be performed: === install 65 packages - install angstrom 0.16.1 - install backoff 0.1.1 - install base v0.17.1 - install bigarray-compat 1.1.0 - install bigstringaf 0.10.0 - install conf-libffi 2.0.0 - install conf-pkg-config 4 - install cppo 1.8.0 - install csexp 1.5.2 - install ctypes 0.23.0 - install ctypes-foreign 0.23.0 - install dune 3.17.2 - install dune-configurator 3.17.2 - install fieldslib v0.17.0 - install integers 0.7.0 - install jane-street-headers v0.17.0 - install jst-config v0.17.0 - install mtime 2.1.0 - install multicore-magic 2.3.1 - install num 1.5-1 - install ocaml-compiler-libs v0.17.0 - install ocaml-syntax-shims 1.0.0 - install ocaml_intrinsics_kernel v0.17.1 - install ocamlbuild 0.16.1 - install ocamlfind 1.9.8 - install parsexp v0.17.0 - install ppx_assert v0.17.0 - install ppx_base v0.17.0 - install ppx_cold v0.17.0 - install ppx_compare v0.17.0 - install ppx_derivers 1.2.1 - install ppx_deriving 6.0.3 - install ppx_enumerate v0.17.0 - install ppx_expect v0.17.2 - install ppx_fields_conv v0.17.0 - install ppx_globalize v0.17.0 - install ppx_hash v0.17.0 - install ppx_here v0.17.0 - install ppx_inline_test v0.17.0 - install ppx_minidebug 2.1.0 - install ppx_optcomp v0.17.0 - install ppx_sexp_conv v0.17.0 - install ppx_string v0.17.0 - install ppx_variants_conv v0.17.0 - install ppxlib 0.35.0 - install ppxlib_jane v0.17.2 - install printbox 0.12 - install printbox-ext-plot 0.12 - install printbox-html 0.12 - install printbox-md 0.12 - install printbox-text 0.12 - install ptime 1.2.0 - install re 1.12.0 - install saturn_lockfree 0.5.0 - install seq base - install sexplib v0.17.0 - install sexplib0 v0.17.0 - install stdio v0.17.0 - install stdlib-shims 0.3.0 - install time_now v0.17.0 - install topkg 1.0.8 - install tyxml 4.6.0 - install uucp 16.0.0 - install uutf 1.0.4 - install variantslib v0.17.0 <><> Processing actions <><><><><><><><><><><><><><><><><><><><><><><><><><><><> -> retrieved backoff.0.1.1 (cached) -> retrieved angstrom.0.16.1 (cached) -> retrieved bigarray-compat.1.1.0 (cached) -> retrieved base.v0.17.1 (cached) -> retrieved bigstringaf.0.10.0 (cached) -> retrieved cppo.1.8.0 (cached) -> installed conf-pkg-config.4 -> retrieved csexp.1.5.2 (cached) -> retrieved ctypes.0.23.0, ctypes-foreign.0.23.0 (cached) -> installed conf-libffi.2.0.0 -> retrieved fieldslib.v0.17.0 (cached) -> retrieved integers.0.7.0 (cached) -> retrieved jane-street-headers.v0.17.0 (cached) -> retrieved jst-config.v0.17.0 (cached) -> retrieved mtime.2.1.0 (cached) -> retrieved multicore-magic.2.3.1 (cached) -> retrieved num.1.5-1 (cached) -> retrieved ocaml-compiler-libs.v0.17.0 (cached) -> retrieved ocaml-syntax-shims.1.0.0 (cached) -> retrieved ocaml_intrinsics_kernel.v0.17.1 (cached) -> retrieved ocamlbuild.0.16.1 (cached) -> retrieved ocamlfind.1.9.8 (cached) -> retrieved parsexp.v0.17.0 (cached) -> retrieved ppx_assert.v0.17.0 (cached) -> retrieved ppx_base.v0.17.0 (cached) -> retrieved ppx_cold.v0.17.0 (cached) -> retrieved ppx_compare.v0.17.0 (cached) -> retrieved ppx_derivers.1.2.1 (cached) -> retrieved ppx_enumerate.v0.17.0 (cached) -> retrieved ppx_deriving.6.0.3 (cached) -> retrieved dune.3.17.2, dune-configurator.3.17.2 (cached) -> installed num.1.5-1 -> retrieved ppx_expect.v0.17.2 (cached) -> retrieved ppx_fields_conv.v0.17.0 (cached) -> retrieved ppx_globalize.v0.17.0 (cached) -> retrieved ppx_hash.v0.17.0 (cached) -> retrieved ppx_here.v0.17.0 (cached) -> retrieved ppx_inline_test.v0.17.0 (cached) -> retrieved ppx_optcomp.v0.17.0 
(cached) -> retrieved ppx_sexp_conv.v0.17.0 (cached) -> retrieved ppx_string.v0.17.0 (cached) -> retrieved ppx_variants_conv.v0.17.0 (cached) -> retrieved ppxlib_jane.v0.17.2 (cached) -> retrieved ppx_minidebug.2.1.0 (cached) -> retrieved ptime.1.2.0 (cached) -> retrieved re.1.12.0 (cached) -> retrieved ppxlib.0.35.0 (cached) -> retrieved seq.base (cached) -> installed seq.base -> retrieved saturn_lockfree.0.5.0 (cached) -> retrieved sexplib.v0.17.0 (cached) -> retrieved printbox.0.12, printbox-ext-plot.0.12, printbox-html.0.12, printbox-md.0.12, printbox-text.0.12 (cached) -> retrieved sexplib0.v0.17.0 (cached) -> retrieved stdio.v0.17.0 (cached) -> retrieved stdlib-shims.0.3.0 (cached) -> retrieved time_now.v0.17.0 (cached) -> retrieved topkg.1.0.8 (cached) -> retrieved tyxml.4.6.0 (cached) -> retrieved uutf.1.0.4 (cached) -> retrieved variantslib.v0.17.0 (cached) -> retrieved uucp.16.0.0 (cached) -> installed ocamlfind.1.9.8 -> installed ocamlbuild.0.16.1 -> installed topkg.1.0.8 -> installed uutf.1.0.4 -> installed mtime.2.1.0 -> installed ptime.1.2.0 -> installed dune.3.17.2 -> installed jane-street-headers.v0.17.0 -> installed ppx_derivers.1.2.1 -> installed backoff.0.1.1 -> installed csexp.1.5.2 -> installed bigarray-compat.1.1.0 -> installed cppo.1.8.0 -> installed multicore-magic.2.3.1 -> installed ocaml-syntax-shims.1.0.0 -> installed ocaml_intrinsics_kernel.v0.17.1 -> installed ocaml-compiler-libs.v0.17.0 -> installed printbox.0.12 -> installed re.1.12.0 -> installed sexplib0.v0.17.0 -> installed stdlib-shims.0.3.0 -> installed saturn_lockfree.0.5.0 -> installed integers.0.7.0 -> installed parsexp.v0.17.0 -> installed dune-configurator.3.17.2 -> installed bigstringaf.0.10.0 -> installed sexplib.v0.17.0 -> installed angstrom.0.16.1 -> installed tyxml.4.6.0 -> installed uucp.16.0.0 -> installed printbox-html.0.12 -> installed printbox-text.0.12 -> installed printbox-md.0.12 -> installed printbox-ext-plot.0.12 -> installed ctypes.0.23.0 -> installed base.v0.17.1 -> installed ctypes-foreign.0.23.0 -> installed fieldslib.v0.17.0 -> installed variantslib.v0.17.0 -> installed stdio.v0.17.0 -> installed ppxlib.0.35.0 -> installed ppx_optcomp.v0.17.0 -> installed ppxlib_jane.v0.17.2 -> installed ppx_cold.v0.17.0 -> installed ppx_here.v0.17.0 -> installed ppx_variants_conv.v0.17.0 -> installed ppx_fields_conv.v0.17.0 -> installed ppx_enumerate.v0.17.0 -> installed ppx_globalize.v0.17.0 -> installed ppx_deriving.6.0.3 -> installed ppx_compare.v0.17.0 -> installed ppx_sexp_conv.v0.17.0 -> installed ppx_hash.v0.17.0 -> installed ppx_assert.v0.17.0 -> installed ppx_minidebug.2.1.0 -> installed ppx_base.v0.17.0 -> installed jst-config.v0.17.0 -> installed ppx_string.v0.17.0 -> installed time_now.v0.17.0 -> installed ppx_inline_test.v0.17.0 -> installed ppx_expect.v0.17.2 Done. # To update the current shell environment, run: eval $(opam env) 2025-03-20 22:51.43 ---> using "d9717d0a09483614d783e8e0c2ae1544a95a9829a9989513258fe2f9a483585c" from cache /src: (copy (src .) (dst /src)) 2025-03-20 22:51.44 ---> saved as "3deb76823dfd90d0bae389082cb0c8c916723ea7f2dac02a1817a1b3bf8be5bf" /src: (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build")) (cd _build/default/test_ppx && ./test_ppx_op.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/default/test_ppx && ./test_ppx_op_expected.exe) Welcome to OCANNL! 
Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition '' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition 'Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a5169fc27086520a1b84bc4c768668a3/default/test/ocannl_config.' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition 'Retrieving commandline, environment, or config file variable ocannl_log_level' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition 'Found 0, in the config file' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition primitive_ops.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition zero2hero_1of7.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition hello_world_op.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. 
Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition einsum_trivia.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition micrograd_demo.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test && .tutorials.inline-tests/inline_test_runner_tutorials.exe inline-test-runner tutorials -partition moons_demo_parallel.ml -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/a18a1d6c1ca5529736aa12ed5cdc5fcf/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/default/test && ./moons_demo_parallel_run.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file ("Set log_level to" 1) └─{orphaned from #2} Retrieving commandline, environment, or config file variable ocannl_backend Found cc, in the config file Retrieving commandline, environment, or config file variable ocannl_ll_ident_style Not found, using default heuristic Retrieving commandline, environment, or config file variable ocannl_cc_backend_optimization_level Not found, using default 3 Retrieving commandline, environment, or config file variable ocannl_cc_backend_compiler_command Not found, using default gcc Retrieving commandline, environment, or config file variable ocannl_never_capture_stdout Not found, using default false Batch=59, step=60, lr=0.200000, batch loss=23.609453, epoch loss=23.609453 Batch=119, step=120, lr=0.199750, batch loss=8.539634, epoch loss=32.149087 Batch=179, step=180, lr=0.199500, batch loss=2.626295, epoch loss=34.775382 Batch=239, step=240, lr=0.199250, batch loss=0.849657, epoch loss=35.625039 Batch=299, step=300, lr=0.199000, batch loss=1.447177, epoch loss=37.072216 Batch=359, step=360, lr=0.198750, batch loss=1.329296, epoch loss=38.401512 Batch=419, step=420, lr=0.198250, batch loss=0.618569, epoch loss=39.020081 Batch=479, step=480, lr=0.198250, batch loss=0.821468, epoch loss=39.841549 Batch=539, step=540, lr=0.197750, batch loss=0.691338, epoch loss=40.532887 Batch=599, step=600, lr=0.197750, batch loss=1.061674, epoch loss=41.594561 Batch=659, step=660, lr=0.197250, batch loss=0.483747, epoch loss=42.078308 Batch=719, step=720, lr=0.197250, batch loss=0.411391, epoch loss=42.489699 Batch=779, step=780, lr=0.197000, batch loss=0.469952, epoch loss=42.959651 Batch=839, step=840, lr=0.196750, batch loss=0.446946, epoch loss=43.406597 Batch=899, step=900, lr=0.196500, batch loss=0.382739, epoch loss=43.789336 Batch=959, step=960, 
lr=0.196250, batch loss=0.244405, epoch loss=44.033740 Batch=1019, step=1020, lr=0.195750, batch loss=0.468002, epoch loss=44.501743 Batch=1079, step=1080, lr=0.195750, batch loss=0.247297, epoch loss=44.749040 Batch=1139, step=1140, lr=0.195250, batch loss=0.318210, epoch loss=45.067250 Batch=1199, step=1200, lr=0.195250, batch loss=0.263269, epoch loss=45.330520 Epoch=0, step=1200, lr=0.195250, epoch loss=45.330520 Batch=59, step=1260, lr=0.195000, batch loss=0.261711, epoch loss=0.261711 Batch=119, step=1320, lr=0.194750, batch loss=0.205535, epoch loss=0.467246 Batch=179, step=1380, lr=0.194500, batch loss=0.243626, epoch loss=0.710872 Batch=239, step=1440, lr=0.194250, batch loss=0.348167, epoch loss=1.059039 Batch=299, step=1500, lr=0.194000, batch loss=0.253643, epoch loss=1.312682 Batch=359, step=1560, lr=0.193750, batch loss=0.316382, epoch loss=1.629064 Batch=419, step=1620, lr=0.193500, batch loss=0.309769, epoch loss=1.938834 Batch=479, step=1680, lr=0.193000, batch loss=0.275541, epoch loss=2.214374 Batch=539, step=1740, lr=0.192750, batch loss=0.210743, epoch loss=2.425117 Batch=599, step=1800, lr=0.192500, batch loss=0.250636, epoch loss=2.675753 Batch=659, step=1860, lr=0.192250, batch loss=0.369837, epoch loss=3.045590 Batch=719, step=1920, lr=0.192000, batch loss=0.364119, epoch loss=3.409709 Batch=779, step=1980, lr=0.192000, batch loss=0.383350, epoch loss=3.793060 Batch=839, step=2040, lr=0.191500, batch loss=0.350229, epoch loss=4.143289 Batch=899, step=2100, lr=0.191250, batch loss=0.322385, epoch loss=4.465674 Batch=959, step=2160, lr=0.191000, batch loss=0.257538, epoch loss=4.723212 Batch=1019, step=2220, lr=0.190750, batch loss=0.388663, epoch loss=5.111875 Batch=1079, step=2280, lr=0.190500, batch loss=0.206070, epoch loss=5.317945 Batch=1139, step=2340, lr=0.190500, batch loss=0.259073, epoch loss=5.577018 Batch=1199, step=2400, lr=0.190250, batch loss=0.211142, epoch loss=5.788161 Epoch=1, step=2400, lr=0.190250, epoch loss=5.788161 Batch=59, step=2460, lr=0.189750, batch loss=0.234605, epoch loss=0.234605 Batch=119, step=2520, lr=0.189750, batch loss=0.194815, epoch loss=0.429420 Batch=179, step=2580, lr=0.189500, batch loss=0.220905, epoch loss=0.650325 Batch=239, step=2640, lr=0.189250, batch loss=0.327962, epoch loss=0.978287 Batch=299, step=2700, lr=0.189000, batch loss=0.203519, epoch loss=1.181806 Batch=359, step=2760, lr=0.188500, batch loss=0.288380, epoch loss=1.470185 Batch=419, step=2820, lr=0.188500, batch loss=0.280302, epoch loss=1.750488 Batch=479, step=2880, lr=0.188250, batch loss=0.252071, epoch loss=2.002559 Batch=539, step=2940, lr=0.188000, batch loss=0.192274, epoch loss=2.194833 Batch=599, step=3000, lr=0.187750, batch loss=0.225259, epoch loss=2.420093 Batch=659, step=3060, lr=0.187250, batch loss=0.331922, epoch loss=2.752014 Batch=719, step=3120, lr=0.187250, batch loss=0.331714, epoch loss=3.083728 Batch=779, step=3180, lr=0.187000, batch loss=0.358292, epoch loss=3.442020 Batch=839, step=3240, lr=0.186750, batch loss=0.324933, epoch loss=3.766953 Batch=899, step=3300, lr=0.186500, batch loss=0.293083, epoch loss=4.060037 Batch=959, step=3360, lr=0.186250, batch loss=0.236359, epoch loss=4.296396 Batch=1019, step=3420, lr=0.186000, batch loss=0.337814, epoch loss=4.634210 Batch=1079, step=3480, lr=0.185750, batch loss=0.191853, epoch loss=4.826063 Batch=1139, step=3540, lr=0.185250, batch loss=0.232268, epoch loss=5.058332 Batch=1199, step=3600, lr=0.185000, batch loss=0.198783, epoch loss=5.257114 Epoch=2, step=3600, lr=0.185000, 
epoch loss=5.257114 Batch=59, step=3660, lr=0.185000, batch loss=0.226869, epoch loss=0.226869 Batch=119, step=3720, lr=0.184500, batch loss=0.191486, epoch loss=0.418355 Batch=179, step=3780, lr=0.184250, batch loss=0.211178, epoch loss=0.629533 Batch=239, step=3840, lr=0.184250, batch loss=0.316553, epoch loss=0.946085 Batch=299, step=3900, lr=0.184000, batch loss=0.202986, epoch loss=1.149072 Batch=359, step=3960, lr=0.183750, batch loss=0.282943, epoch loss=1.432014 Batch=419, step=4020, lr=0.183500, batch loss=0.271976, epoch loss=1.703991 Batch=479, step=4080, lr=0.183250, batch loss=0.249648, epoch loss=1.953639 Batch=539, step=4140, lr=0.183000, batch loss=0.194471, epoch loss=2.148110 Batch=599, step=4200, lr=0.182750, batch loss=0.243266, epoch loss=2.391376 Batch=659, step=4260, lr=0.182500, batch loss=0.328290, epoch loss=2.719665 Batch=719, step=4320, lr=0.182250, batch loss=0.329012, epoch loss=3.048677 Batch=779, step=4380, lr=0.182000, batch loss=0.350499, epoch loss=3.399176 Batch=839, step=4440, lr=0.181750, batch loss=0.318996, epoch loss=3.718172 Batch=899, step=4500, lr=0.181500, batch loss=0.290733, epoch loss=4.008905 Batch=959, step=4560, lr=0.181250, batch loss=0.244639, epoch loss=4.253544 Batch=1019, step=4620, lr=0.181000, batch loss=0.339256, epoch loss=4.592800 Batch=1079, step=4680, lr=0.180750, batch loss=0.222381, epoch loss=4.815180 Batch=1139, step=4740, lr=0.180500, batch loss=0.246629, epoch loss=5.061809 Batch=1199, step=4800, lr=0.180250, batch loss=0.191209, epoch loss=5.253019 Epoch=3, step=4800, lr=0.180250, epoch loss=5.253019 Batch=59, step=4860, lr=0.180000, batch loss=0.226537, epoch loss=0.226537 Batch=119, step=4920, lr=0.179750, batch loss=0.190731, epoch loss=0.417267 Batch=179, step=4980, lr=0.179500, batch loss=0.206210, epoch loss=0.623477 Batch=239, step=5040, lr=0.179250, batch loss=0.307476, epoch loss=0.930953 Batch=299, step=5100, lr=0.179000, batch loss=0.207250, epoch loss=1.138204 Batch=359, step=5160, lr=0.178750, batch loss=0.276754, epoch loss=1.414958 Batch=419, step=5220, lr=0.178500, batch loss=0.280100, epoch loss=1.695057 Batch=479, step=5280, lr=0.178250, batch loss=0.255888, epoch loss=1.950945 Batch=539, step=5340, lr=0.178000, batch loss=0.189935, epoch loss=2.140880 Batch=599, step=5400, lr=0.177750, batch loss=0.229680, epoch loss=2.370560 Batch=659, step=5460, lr=0.177500, batch loss=0.324831, epoch loss=2.695391 Batch=719, step=5520, lr=0.177250, batch loss=0.325692, epoch loss=3.021083 Batch=779, step=5580, lr=0.177000, batch loss=0.344705, epoch loss=3.365789 Batch=839, step=5640, lr=0.176750, batch loss=0.308637, epoch loss=3.674426 Batch=899, step=5700, lr=0.176500, batch loss=0.272375, epoch loss=3.946800 Batch=959, step=5760, lr=0.176000, batch loss=0.215247, epoch loss=4.162047 Batch=1019, step=5820, lr=0.176000, batch loss=0.333892, epoch loss=4.495939 Batch=1079, step=5880, lr=0.175750, batch loss=0.178284, epoch loss=4.674223 Batch=1139, step=5940, lr=0.175250, batch loss=0.217670, epoch loss=4.891894 Batch=1199, step=6000, lr=0.175250, batch loss=0.191087, epoch loss=5.082980 Epoch=4, step=6000, lr=0.175250, epoch loss=5.082980 Batch=59, step=6060, lr=0.175000, batch loss=0.234431, epoch loss=0.234431 Batch=119, step=6120, lr=0.174750, batch loss=0.190856, epoch loss=0.425287 Batch=179, step=6180, lr=0.174500, batch loss=0.200388, epoch loss=0.625676 Batch=239, step=6240, lr=0.174250, batch loss=0.299731, epoch loss=0.925407 Batch=299, step=6300, lr=0.174000, batch loss=0.213999, epoch loss=1.139406 
Batch=359, step=6360, lr=0.173750, batch loss=0.271092, epoch loss=1.410498 Batch=419, step=6420, lr=0.173500, batch loss=0.263656, epoch loss=1.674154 Batch=479, step=6480, lr=0.173250, batch loss=0.245389, epoch loss=1.919543 Batch=539, step=6540, lr=0.172750, batch loss=0.191190, epoch loss=2.110733 Batch=599, step=6600, lr=0.172750, batch loss=0.230942, epoch loss=2.341675 Batch=659, step=6660, lr=0.172500, batch loss=0.313851, epoch loss=2.655526 Batch=719, step=6720, lr=0.172250, batch loss=0.315614, epoch loss=2.971140 Batch=779, step=6780, lr=0.172000, batch loss=0.329532, epoch loss=3.300672 Batch=839, step=6840, lr=0.171750, batch loss=0.305594, epoch loss=3.606265 Batch=899, step=6900, lr=0.171500, batch loss=0.264150, epoch loss=3.870415 Batch=959, step=6960, lr=0.171250, batch loss=0.213983, epoch loss=4.084398 Batch=1019, step=7020, lr=0.170750, batch loss=0.321472, epoch loss=4.405870 Batch=1079, step=7080, lr=0.170500, batch loss=0.178471, epoch loss=4.584341 Batch=1139, step=7140, lr=0.170500, batch loss=0.214387, epoch loss=4.798728 Batch=1199, step=7200, lr=0.170250, batch loss=0.184145, epoch loss=4.982873 Epoch=5, step=7200, lr=0.170250, epoch loss=4.982873 Batch=59, step=7260, lr=0.170000, batch loss=0.236862, epoch loss=0.236862 Batch=119, step=7320, lr=0.169750, batch loss=0.181391, epoch loss=0.418253 Batch=179, step=7380, lr=0.169500, batch loss=0.194244, epoch loss=0.612496 Batch=239, step=7440, lr=0.169250, batch loss=0.289191, epoch loss=0.901688 Batch=299, step=7500, lr=0.169000, batch loss=0.206409, epoch loss=1.108096 Batch=359, step=7560, lr=0.168750, batch loss=0.261313, epoch loss=1.369409 Batch=419, step=7620, lr=0.168500, batch loss=0.254831, epoch loss=1.624241 Batch=479, step=7680, lr=0.168250, batch loss=0.237646, epoch loss=1.861887 Batch=539, step=7740, lr=0.168000, batch loss=0.186468, epoch loss=2.048355 Batch=599, step=7800, lr=0.167750, batch loss=0.226717, epoch loss=2.275072 Batch=659, step=7860, lr=0.167500, batch loss=0.304045, epoch loss=2.579118 Batch=719, step=7920, lr=0.167250, batch loss=0.306024, epoch loss=2.885141 Batch=779, step=7980, lr=0.167000, batch loss=0.319524, epoch loss=3.204666 Batch=839, step=8040, lr=0.166750, batch loss=0.295966, epoch loss=3.500632 Batch=899, step=8100, lr=0.166500, batch loss=0.258072, epoch loss=3.758704 Batch=959, step=8160, lr=0.166250, batch loss=0.207727, epoch loss=3.966431 Batch=1019, step=8220, lr=0.166000, batch loss=0.318099, epoch loss=4.284530 Batch=1079, step=8280, lr=0.165750, batch loss=0.179365, epoch loss=4.463895 Batch=1139, step=8340, lr=0.165500, batch loss=0.210344, epoch loss=4.674239 Batch=1199, step=8400, lr=0.165250, batch loss=0.172020, epoch loss=4.846259 Epoch=6, step=8400, lr=0.165250, epoch loss=4.846259 Batch=59, step=8460, lr=0.165000, batch loss=0.210697, epoch loss=0.210697 Batch=119, step=8520, lr=0.164750, batch loss=0.177198, epoch loss=0.387896 Batch=179, step=8580, lr=0.164500, batch loss=0.188247, epoch loss=0.576143 Batch=239, step=8640, lr=0.164250, batch loss=0.276793, epoch loss=0.852935 Batch=299, step=8700, lr=0.164000, batch loss=0.195885, epoch loss=1.048820 Batch=359, step=8760, lr=0.163750, batch loss=0.253633, epoch loss=1.302453 Batch=419, step=8820, lr=0.163500, batch loss=0.243224, epoch loss=1.545677 Batch=479, step=8880, lr=0.163250, batch loss=0.228489, epoch loss=1.774166 Batch=539, step=8940, lr=0.163000, batch loss=0.176547, epoch loss=1.950713 Batch=599, step=9000, lr=0.162750, batch loss=0.217929, epoch loss=2.168642 Batch=659, step=9060, 
lr=0.162500, batch loss=0.292971, epoch loss=2.461613 Batch=719, step=9120, lr=0.162250, batch loss=0.296692, epoch loss=2.758305 Batch=779, step=9180, lr=0.162000, batch loss=0.313507, epoch loss=3.071812 Batch=839, step=9240, lr=0.161750, batch loss=0.281191, epoch loss=3.353004 Batch=899, step=9300, lr=0.161500, batch loss=0.250645, epoch loss=3.603649 Batch=959, step=9360, lr=0.161000, batch loss=0.190043, epoch loss=3.793691 Batch=1019, step=9420, lr=0.161000, batch loss=0.314963, epoch loss=4.108654 Batch=1079, step=9480, lr=0.160750, batch loss=0.187844, epoch loss=4.296498 Batch=1139, step=9540, lr=0.160250, batch loss=0.205163, epoch loss=4.501661 Batch=1199, step=9600, lr=0.160250, batch loss=0.165318, epoch loss=4.666979 Epoch=7, step=9600, lr=0.160250, epoch loss=4.666979 Batch=59, step=9660, lr=0.160000, batch loss=0.194733, epoch loss=0.194733 Batch=119, step=9720, lr=0.159750, batch loss=0.165918, epoch loss=0.360651 Batch=179, step=9780, lr=0.159500, batch loss=0.178974, epoch loss=0.539624 Batch=239, step=9840, lr=0.159250, batch loss=0.262529, epoch loss=0.802153 Batch=299, step=9900, lr=0.159000, batch loss=0.183160, epoch loss=0.985313 Batch=359, step=9960, lr=0.158750, batch loss=0.238061, epoch loss=1.223374 Batch=419, step=10020, lr=0.158500, batch loss=0.231954, epoch loss=1.455328 Batch=479, step=10080, lr=0.158250, batch loss=0.218934, epoch loss=1.674262 Batch=539, step=10140, lr=0.158000, batch loss=0.168817, epoch loss=1.843079 Batch=599, step=10200, lr=0.157750, batch loss=0.201854, epoch loss=2.044933 Batch=659, step=10260, lr=0.157500, batch loss=0.279949, epoch loss=2.324882 Batch=719, step=10320, lr=0.157250, batch loss=0.278707, epoch loss=2.603589 Batch=779, step=10380, lr=0.157000, batch loss=0.298932, epoch loss=2.902521 Batch=839, step=10440, lr=0.156750, batch loss=0.269651, epoch loss=3.172171 Batch=899, step=10500, lr=0.156500, batch loss=0.239258, epoch loss=3.411429 Batch=959, step=10560, lr=0.156250, batch loss=0.200327, epoch loss=3.611756 Batch=1019, step=10620, lr=0.156000, batch loss=0.276518, epoch loss=3.888274 Batch=1079, step=10680, lr=0.155750, batch loss=0.150962, epoch loss=4.039236 Batch=1139, step=10740, lr=0.155500, batch loss=0.182245, epoch loss=4.221481 Batch=1199, step=10800, lr=0.155250, batch loss=0.154820, epoch loss=4.376301 Epoch=8, step=10800, lr=0.155250, epoch loss=4.376301 Batch=59, step=10860, lr=0.155000, batch loss=0.192442, epoch loss=0.192442 Batch=119, step=10920, lr=0.154750, batch loss=0.161531, epoch loss=0.353974 Batch=179, step=10980, lr=0.154500, batch loss=0.165293, epoch loss=0.519267 Batch=239, step=11040, lr=0.154250, batch loss=0.241457, epoch loss=0.760724 Batch=299, step=11100, lr=0.154000, batch loss=0.166818, epoch loss=0.927542 Batch=359, step=11160, lr=0.153750, batch loss=0.224019, epoch loss=1.151561 Batch=419, step=11220, lr=0.153500, batch loss=0.227332, epoch loss=1.378893 Batch=479, step=11280, lr=0.153250, batch loss=0.202701, epoch loss=1.581595 Batch=539, step=11340, lr=0.153000, batch loss=0.158771, epoch loss=1.740365 Batch=599, step=11400, lr=0.152750, batch loss=0.183592, epoch loss=1.923957 Batch=659, step=11460, lr=0.152500, batch loss=0.261464, epoch loss=2.185422 Batch=719, step=11520, lr=0.152250, batch loss=0.255061, epoch loss=2.440483 Batch=779, step=11580, lr=0.152000, batch loss=0.267730, epoch loss=2.708213 Batch=839, step=11640, lr=0.151750, batch loss=0.256593, epoch loss=2.964806 Batch=899, step=11700, lr=0.151500, batch loss=0.230234, epoch loss=3.195040 Batch=959, 
step=11760, lr=0.151250, batch loss=0.167987, epoch loss=3.363027 Batch=1019, step=11820, lr=0.151000, batch loss=0.262963, epoch loss=3.625990 Batch=1079, step=11880, lr=0.150750, batch loss=0.147507, epoch loss=3.773497 Batch=1139, step=11940, lr=0.150500, batch loss=0.186796, epoch loss=3.960293 Batch=1199, step=12000, lr=0.150250, batch loss=0.137960, epoch loss=4.098253 Epoch=9, step=12000, lr=0.150250, epoch loss=4.098253 Batch=59, step=12060, lr=0.150000, batch loss=0.161891, epoch loss=0.161891 Batch=119, step=12120, lr=0.149750, batch loss=0.134320, epoch loss=0.296211 Batch=179, step=12180, lr=0.149500, batch loss=0.149976, epoch loss=0.446186 Batch=239, step=12240, lr=0.149250, batch loss=0.218265, epoch loss=0.664451 Batch=299, step=12300, lr=0.149000, batch loss=0.139025, epoch loss=0.803476 Batch=359, step=12360, lr=0.148750, batch loss=0.198307, epoch loss=1.001783 Batch=419, step=12420, lr=0.148500, batch loss=0.202380, epoch loss=1.204163 Batch=479, step=12480, lr=0.148250, batch loss=0.173297, epoch loss=1.377461 Batch=539, step=12540, lr=0.148000, batch loss=0.144713, epoch loss=1.522174 Batch=599, step=12600, lr=0.147750, batch loss=0.146899, epoch loss=1.669073 Batch=659, step=12660, lr=0.147500, batch loss=0.214703, epoch loss=1.883776 Batch=719, step=12720, lr=0.147250, batch loss=0.214724, epoch loss=2.098500 Batch=779, step=12780, lr=0.147000, batch loss=0.239165, epoch loss=2.337665 Batch=839, step=12840, lr=0.146500, batch loss=0.224280, epoch loss=2.561945 Batch=899, step=12900, lr=0.146500, batch loss=0.187053, epoch loss=2.748998 Batch=959, step=12960, lr=0.146250, batch loss=0.171748, epoch loss=2.920746 Batch=1019, step=13020, lr=0.146000, batch loss=0.312399, epoch loss=3.233146 Batch=1079, step=13080, lr=0.145750, batch loss=0.109275, epoch loss=3.342421 Batch=1139, step=13140, lr=0.145500, batch loss=0.151086, epoch loss=3.493507 Batch=1199, step=13200, lr=0.145250, batch loss=0.116556, epoch loss=3.610063 Epoch=10, step=13200, lr=0.145250, epoch loss=3.610063 Batch=59, step=13260, lr=0.145000, batch loss=0.143305, epoch loss=0.143305 Batch=119, step=13320, lr=0.144750, batch loss=0.116704, epoch loss=0.260009 Batch=179, step=13380, lr=0.144500, batch loss=0.129161, epoch loss=0.389170 Batch=239, step=13440, lr=0.144000, batch loss=0.188955, epoch loss=0.578125 Batch=299, step=13500, lr=0.144000, batch loss=0.118113, epoch loss=0.696238 Batch=359, step=13560, lr=0.143750, batch loss=0.162032, epoch loss=0.858270 Batch=419, step=13620, lr=0.143500, batch loss=0.156248, epoch loss=1.014518 Batch=479, step=13680, lr=0.143250, batch loss=0.146052, epoch loss=1.160570 Batch=539, step=13740, lr=0.143000, batch loss=0.114591, epoch loss=1.275161 Batch=599, step=13800, lr=0.142750, batch loss=0.118791, epoch loss=1.393952 Batch=659, step=13860, lr=0.142500, batch loss=0.179359, epoch loss=1.573312 Batch=719, step=13920, lr=0.142250, batch loss=0.194222, epoch loss=1.767533 Batch=779, step=13980, lr=0.142000, batch loss=0.220600, epoch loss=1.988134 Batch=839, step=14040, lr=0.141750, batch loss=0.203874, epoch loss=2.192007 Batch=899, step=14100, lr=0.141500, batch loss=0.221813, epoch loss=2.413820 Batch=959, step=14160, lr=0.141250, batch loss=0.096688, epoch loss=2.510509 Batch=1019, step=14220, lr=0.141000, batch loss=0.201918, epoch loss=2.712427 Batch=1079, step=14280, lr=0.140750, batch loss=0.071943, epoch loss=2.784370 Batch=1139, step=14340, lr=0.140500, batch loss=0.113262, epoch loss=2.897632 Batch=1199, step=14400, lr=0.140250, batch loss=0.084826, 
epoch loss=2.982459 Epoch=11, step=14400, lr=0.140250, epoch loss=2.982459 Batch=59, step=14460, lr=0.140000, batch loss=0.102785, epoch loss=0.102785 Batch=119, step=14520, lr=0.139500, batch loss=0.098745, epoch loss=0.201530 Batch=179, step=14580, lr=0.139500, batch loss=0.097513, epoch loss=0.299043 Batch=239, step=14640, lr=0.139250, batch loss=0.142294, epoch loss=0.441337 Batch=299, step=14700, lr=0.139000, batch loss=0.076597, epoch loss=0.517934 Batch=359, step=14760, lr=0.138750, batch loss=0.116652, epoch loss=0.634586 Batch=419, step=14820, lr=0.138500, batch loss=0.128131, epoch loss=0.762717 Batch=479, step=14880, lr=0.138250, batch loss=0.099990, epoch loss=0.862707 Batch=539, step=14940, lr=0.138000, batch loss=0.118237, epoch loss=0.980944 Batch=599, step=15000, lr=0.137500, batch loss=0.082772, epoch loss=1.063716 Batch=659, step=15060, lr=0.137500, batch loss=0.125445, epoch loss=1.189161 Batch=719, step=15120, lr=0.137250, batch loss=0.117734, epoch loss=1.306895 Batch=779, step=15180, lr=0.137000, batch loss=0.129597, epoch loss=1.436492 Batch=839, step=15240, lr=0.136750, batch loss=0.158686, epoch loss=1.595178 Batch=899, step=15300, lr=0.136500, batch loss=0.282028, epoch loss=1.877206 Batch=959, step=15360, lr=0.136250, batch loss=0.060728, epoch loss=1.937934 Batch=1019, step=15420, lr=0.136000, batch loss=0.129784, epoch loss=2.067718 Batch=1079, step=15480, lr=0.135750, batch loss=0.046758, epoch loss=2.114476 Batch=1139, step=15540, lr=0.135500, batch loss=0.099750, epoch loss=2.214226 Batch=1199, step=15600, lr=0.135250, batch loss=0.058935, epoch loss=2.273161 Epoch=12, step=15600, lr=0.135250, epoch loss=2.273161 Batch=59, step=15660, lr=0.135000, batch loss=0.068750, epoch loss=0.068750 Batch=119, step=15720, lr=0.134750, batch loss=0.089363, epoch loss=0.158113 Batch=179, step=15780, lr=0.134500, batch loss=0.079266, epoch loss=0.237380 Batch=239, step=15840, lr=0.134250, batch loss=0.088670, epoch loss=0.326050 Batch=299, step=15900, lr=0.134000, batch loss=0.041461, epoch loss=0.367511 Batch=359, step=15960, lr=0.133750, batch loss=0.080137, epoch loss=0.447647 Batch=419, step=16020, lr=0.133500, batch loss=0.112654, epoch loss=0.560301 Batch=479, step=16080, lr=0.133250, batch loss=0.046373, epoch loss=0.606674 Batch=539, step=16140, lr=0.132750, batch loss=0.044367, epoch loss=0.651041 Batch=599, step=16200, lr=0.132750, batch loss=0.052278, epoch loss=0.703319 Batch=659, step=16260, lr=0.132500, batch loss=0.077321, epoch loss=0.780640 Batch=719, step=16320, lr=0.132250, batch loss=0.096391, epoch loss=0.877031 Batch=779, step=16380, lr=0.132000, batch loss=0.098028, epoch loss=0.975059 Batch=839, step=16440, lr=0.131750, batch loss=0.192589, epoch loss=1.167648 Batch=899, step=16500, lr=0.131500, batch loss=0.087367, epoch loss=1.255015 Batch=959, step=16560, lr=0.131250, batch loss=0.042444, epoch loss=1.297459 Batch=1019, step=16620, lr=0.131000, batch loss=0.052554, epoch loss=1.350013 Batch=1079, step=16680, lr=0.130750, batch loss=0.055245, epoch loss=1.405258 Batch=1139, step=16740, lr=0.130500, batch loss=0.104560, epoch loss=1.509817 Batch=1199, step=16800, lr=0.130250, batch loss=0.043053, epoch loss=1.552870 Epoch=13, step=16800, lr=0.130250, epoch loss=1.552870 Batch=59, step=16860, lr=0.130000, batch loss=0.035867, epoch loss=0.035867 Batch=119, step=16920, lr=0.129750, batch loss=0.038065, epoch loss=0.073932 Batch=179, step=16980, lr=0.129500, batch loss=0.042576, epoch loss=0.116509 Batch=239, step=17040, lr=0.129250, batch 
loss=0.056911, epoch loss=0.173420 Batch=299, step=17100, lr=0.129000, batch loss=0.018191, epoch loss=0.191610 Batch=359, step=17160, lr=0.128750, batch loss=0.041685, epoch loss=0.233295 Batch=419, step=17220, lr=0.128500, batch loss=0.043524, epoch loss=0.276819 Batch=479, step=17280, lr=0.128250, batch loss=0.040520, epoch loss=0.317339 Batch=539, step=17340, lr=0.128000, batch loss=0.097386, epoch loss=0.414725 Batch=599, step=17400, lr=0.127750, batch loss=0.040585, epoch loss=0.455310 Batch=659, step=17460, lr=0.127500, batch loss=0.048914, epoch loss=0.504224 Batch=719, step=17520, lr=0.127250, batch loss=0.063189, epoch loss=0.567413 Batch=779, step=17580, lr=0.127000, batch loss=0.118053, epoch loss=0.685466 Batch=839, step=17640, lr=0.126750, batch loss=0.070619, epoch loss=0.756084 Batch=899, step=17700, lr=0.126500, batch loss=0.079750, epoch loss=0.835834 Batch=959, step=17760, lr=0.126250, batch loss=0.036020, epoch loss=0.871855 Batch=1019, step=17820, lr=0.126000, batch loss=0.072351, epoch loss=0.944206 Batch=1079, step=17880, lr=0.125750, batch loss=0.019807, epoch loss=0.964013 Batch=1139, step=17940, lr=0.125500, batch loss=0.037874, epoch loss=1.001887 Batch=1199, step=18000, lr=0.125250, batch loss=0.016738, epoch loss=1.018625 Epoch=14, step=18000, lr=0.125250, epoch loss=1.018625 Batch=59, step=18060, lr=0.124750, batch loss=0.010890, epoch loss=0.010890 Batch=119, step=18120, lr=0.124750, batch loss=0.020281, epoch loss=0.031171 Batch=179, step=18180, lr=0.124500, batch loss=0.033441, epoch loss=0.064612 Batch=239, step=18240, lr=0.124250, batch loss=0.036938, epoch loss=0.101550 Batch=299, step=18300, lr=0.124000, batch loss=0.011885, epoch loss=0.113435 Batch=359, step=18360, lr=0.123750, batch loss=0.023412, epoch loss=0.136846 Batch=419, step=18420, lr=0.123500, batch loss=0.029483, epoch loss=0.166329 Batch=479, step=18480, lr=0.123250, batch loss=0.018198, epoch loss=0.184527 Batch=539, step=18540, lr=0.123000, batch loss=0.031311, epoch loss=0.215838 Batch=599, step=18600, lr=0.122750, batch loss=0.025919, epoch loss=0.241758 Batch=659, step=18660, lr=0.122500, batch loss=0.030546, epoch loss=0.272303 Batch=719, step=18720, lr=0.122250, batch loss=0.045399, epoch loss=0.317702 Batch=779, step=18780, lr=0.122000, batch loss=0.112649, epoch loss=0.430351 Batch=839, step=18840, lr=0.121750, batch loss=0.054547, epoch loss=0.484898 Batch=899, step=18900, lr=0.121250, batch loss=0.055795, epoch loss=0.540692 Batch=959, step=18960, lr=0.121250, batch loss=0.013715, epoch loss=0.554408 Batch=1019, step=19020, lr=0.120750, batch loss=0.020916, epoch loss=0.575324 Batch=1079, step=19080, lr=0.120750, batch loss=0.009732, epoch loss=0.585056 Batch=1139, step=19140, lr=0.120500, batch loss=0.022672, epoch loss=0.607727 Batch=1199, step=19200, lr=0.120250, batch loss=0.009173, epoch loss=0.616900 Epoch=15, step=19200, lr=0.120250, epoch loss=0.616900 Batch=59, step=19260, lr=0.120000, batch loss=0.004001, epoch loss=0.004001 Batch=119, step=19320, lr=0.119750, batch loss=0.014818, epoch loss=0.018819 Batch=179, step=19380, lr=0.119500, batch loss=0.033925, epoch loss=0.052744 Batch=239, step=19440, lr=0.119250, batch loss=0.020220, epoch loss=0.072964 Batch=299, step=19500, lr=0.119000, batch loss=0.010772, epoch loss=0.083736 Batch=359, step=19560, lr=0.118750, batch loss=0.022846, epoch loss=0.106582 Batch=419, step=19620, lr=0.118250, batch loss=0.019142, epoch loss=0.125725 Batch=479, step=19680, lr=0.118250, batch loss=0.006254, epoch loss=0.131978 Batch=539, 
step=19740, lr=0.118000, batch loss=0.017765, epoch loss=0.149744 Batch=599, step=19800, lr=0.117750, batch loss=0.021833, epoch loss=0.171577 Batch=659, step=19860, lr=0.117500, batch loss=0.017651, epoch loss=0.189228 Batch=719, step=19920, lr=0.117000, batch loss=0.039083, epoch loss=0.228311 Batch=779, step=19980, lr=0.117000, batch loss=0.076509, epoch loss=0.304820 Batch=839, step=20040, lr=0.116750, batch loss=0.029728, epoch loss=0.334548 Batch=899, step=20100, lr=0.116500, batch loss=0.032093, epoch loss=0.366641 Batch=959, step=20160, lr=0.116250, batch loss=0.012256, epoch loss=0.378898 Batch=1019, step=20220, lr=0.116000, batch loss=0.016606, epoch loss=0.395503 Batch=1079, step=20280, lr=0.115750, batch loss=0.002382, epoch loss=0.397886 Batch=1139, step=20340, lr=0.115500, batch loss=0.014546, epoch loss=0.412432 Batch=1199, step=20400, lr=0.115250, batch loss=0.006929, epoch loss=0.419361 Epoch=16, step=20400, lr=0.115250, epoch loss=0.419361 Batch=59, step=20460, lr=0.115000, batch loss=0.003190, epoch loss=0.003190 Batch=119, step=20520, lr=0.114750, batch loss=0.008672, epoch loss=0.011862 Batch=179, step=20580, lr=0.114250, batch loss=0.020060, epoch loss=0.031922 Batch=239, step=20640, lr=0.114250, batch loss=0.019107, epoch loss=0.051030 Batch=299, step=20700, lr=0.114000, batch loss=0.009718, epoch loss=0.060747 Batch=359, step=20760, lr=0.113750, batch loss=0.013617, epoch loss=0.074365 Batch=419, step=20820, lr=0.113500, batch loss=0.014319, epoch loss=0.088683 Batch=479, step=20880, lr=0.113250, batch loss=0.003471, epoch loss=0.092154 Batch=539, step=20940, lr=0.113000, batch loss=0.018523, epoch loss=0.110677 Batch=599, step=21000, lr=0.112500, batch loss=0.019922, epoch loss=0.130599 Batch=659, step=21060, lr=0.112250, batch loss=0.016154, epoch loss=0.146753 Batch=719, step=21120, lr=0.112000, batch loss=0.035962, epoch loss=0.182715 Batch=779, step=21180, lr=0.111750, batch loss=0.075799, epoch loss=0.258514 Batch=839, step=21240, lr=0.111500, batch loss=0.026428, epoch loss=0.284942 Batch=899, step=21300, lr=0.111500, batch loss=0.033945, epoch loss=0.318887 Batch=959, step=21360, lr=0.111250, batch loss=0.009851, epoch loss=0.328738 Batch=1019, step=21420, lr=0.110750, batch loss=0.011855, epoch loss=0.340593 Batch=1079, step=21480, lr=0.110500, batch loss=0.000518, epoch loss=0.341111 Batch=1139, step=21540, lr=0.110250, batch loss=0.012001, epoch loss=0.353112 Batch=1199, step=21600, lr=0.110250, batch loss=0.004959, epoch loss=0.358072 Epoch=17, step=21600, lr=0.110250, epoch loss=0.358072 Batch=59, step=21660, lr=0.110000, batch loss=0.002146, epoch loss=0.002146 Batch=119, step=21720, lr=0.109750, batch loss=0.006433, epoch loss=0.008579 Batch=179, step=21780, lr=0.109500, batch loss=0.012617, epoch loss=0.021195 Batch=239, step=21840, lr=0.109000, batch loss=0.009132, epoch loss=0.030327 Batch=299, step=21900, lr=0.108750, batch loss=0.004445, epoch loss=0.034772 Batch=359, step=21960, lr=0.108500, batch loss=0.013397, epoch loss=0.048169 Batch=419, step=22020, lr=0.108500, batch loss=0.012057, epoch loss=0.060226 Batch=479, step=22080, lr=0.108250, batch loss=0.003031, epoch loss=0.063257 Batch=539, step=22140, lr=0.108000, batch loss=0.017307, epoch loss=0.080564 Batch=599, step=22200, lr=0.107750, batch loss=0.016999, epoch loss=0.097563 Batch=659, step=22260, lr=0.107500, batch loss=0.014499, epoch loss=0.112062 Batch=719, step=22320, lr=0.107250, batch loss=0.025979, epoch loss=0.138041 Batch=779, step=22380, lr=0.107000, batch loss=0.043290, 
epoch loss=0.181332 Batch=839, step=22440, lr=0.106750, batch loss=0.021668, epoch loss=0.203000 Batch=899, step=22500, lr=0.106500, batch loss=0.022119, epoch loss=0.225119 Batch=959, step=22560, lr=0.106250, batch loss=0.011018, epoch loss=0.236137 Batch=1019, step=22620, lr=0.106000, batch loss=0.009192, epoch loss=0.245329 Batch=1079, step=22680, lr=0.105750, batch loss=0.000043, epoch loss=0.245372 Batch=1139, step=22740, lr=0.105500, batch loss=0.010086, epoch loss=0.255459 Batch=1199, step=22800, lr=0.105250, batch loss=0.004628, epoch loss=0.260087 Epoch=18, step=22800, lr=0.105250, epoch loss=0.260087 Batch=59, step=22860, lr=0.105000, batch loss=0.001902, epoch loss=0.001902 Batch=119, step=22920, lr=0.104750, batch loss=0.005467, epoch loss=0.007369 Batch=179, step=22980, lr=0.104500, batch loss=0.010518, epoch loss=0.017887 Batch=239, step=23040, lr=0.104250, batch loss=0.009595, epoch loss=0.027482 Batch=299, step=23100, lr=0.104000, batch loss=0.010878, epoch loss=0.038360 Batch=359, step=23160, lr=0.103750, batch loss=0.012372, epoch loss=0.050732 Batch=419, step=23220, lr=0.103500, batch loss=0.010605, epoch loss=0.061337 Batch=479, step=23280, lr=0.103250, batch loss=0.002482, epoch loss=0.063819 Batch=539, step=23340, lr=0.103000, batch loss=0.016880, epoch loss=0.080699 Batch=599, step=23400, lr=0.102750, batch loss=0.013816, epoch loss=0.094515 Batch=659, step=23460, lr=0.102250, batch loss=0.010429, epoch loss=0.104944 Batch=719, step=23520, lr=0.102250, batch loss=0.013719, epoch loss=0.118663 Batch=779, step=23580, lr=0.102000, batch loss=0.022377, epoch loss=0.141040 Batch=839, step=23640, lr=0.101750, batch loss=0.028010, epoch loss=0.169050 Batch=899, step=23700, lr=0.101500, batch loss=0.021417, epoch loss=0.190466 Batch=959, step=23760, lr=0.101250, batch loss=0.009945, epoch loss=0.200411 Batch=1019, step=23820, lr=0.101000, batch loss=0.007239, epoch loss=0.207649 Batch=1079, step=23880, lr=0.100750, batch loss=0.000951, epoch loss=0.208600 Batch=1139, step=23940, lr=0.100500, batch loss=0.008720, epoch loss=0.217320 Batch=1199, step=24000, lr=0.100250, batch loss=0.004598, epoch loss=0.221918 Epoch=19, step=24000, lr=0.100250, epoch loss=0.221918 Half-moons scatterplot and decision boundary: ┌────────────────────────────────────────────────────────────────────────────────────────────────────┐ │********************************#*******************************************************************│ │**********************#*#*#######*###*#####*********************************************************│ │**********************#########################*****************************************************│ │*****************#**########*######*###########*###*************************************************│ │***************#################*###################************************************************│ │************######*#################*#################**********************************************│ │**********#*#####*########*#**************##*#########*#********************************************│ │***********########*##*#******************#*****##########******************************************│ │***********###########*************************############**************************************...│ │********######*####*********************************###*###*#**********************************.....│ │*******######**##*************...******************#*######*#*******************************........│ 
│*******##*##**##**********...........***************########*##***************************..........│ │*****#######************.......%...%%...***************#########************************..........%.│ │******######***********.........%........***************##*#####***********************.......%.%.%.│ │***#########**********.........%%%.%%......*************#*#######*********************.......%.%%%%.│ │****#######**********..........%%%%.........************#########********************........%%.%%.%│ │**#######************..........%%%%%%%.......**************###*###******************.........%%%%%%.│ │*##*####************...........%%%%%%%.........***********########****************...........%%%%%%.│ │*#######************...........%%%%%%%..........************#######**************............%%%%%%.│ │*##*####***********............%%.%%%%%...........***********####***************............%%%%%%%.│ │*#####*#***********.............%%%%%%%............**********##*###************..............%%%%%..│ │#######***********.............%.%%%%%%.............*********#######**********.............%%%%.%%..│ │#####*#**********...............%%%%%%%...............*******#######********...............%%%%%%%%.│ │###*#*#**********...............%%%%%%%%%..............*******######*******................%%%%%%...│ │#######*********.................%%%%%%%%...............*****###*###******................%%%%%%....│ │######**********.................%%%%%%%%%................***#*###*******...............%%%%%%%%%...│ │*#*##*#********...................%%%%%%%%%%...............***######***..................%%%%%%.....│ │#****##********....................%%%%%%%%%................***###*#**................%.%%%%%%%.....│ │**************.....................%.%%%%%%...................*******..................%.%%.%%......│ │**************.......................%..%%%%%%%................*****..............%.%%%%%%%%%.......│ │*************.........................%.%%%.%%%%................***...............%%%%%%%.%.%.......│ │************............................%..%%%%..%................................%%%%%%%%..........│ │************.............................%%%%%%%%%%%........................%%..%%%%%%%%.%..........│ │***********..............................%%.%%%%%%%%..%....................%..%%%.%%%%%%%...........│ │***********.................................%%%%.%%%%%%%%...............%.%%%%%%%%%%%%.%............│ │**********...................................%%%%%%%%%%%%%%%%%%%%%%.%%%%.%%%%%%%%%%%%%..............│ │**********....................................%%.%%%%%%%%%%%%%%%%%%%%%%.%%%%%%%%%%%.................│ │*********.........................................%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%...................│ │********.............................................%%%.%%%%%%%%%%%%%%%%%%%%%......................│ │********................................................%...%%%%.%%.%%%%..%.........................│ └────────────────────────────────────────────────────────────────────────────────────────────────────┘ 2025-03-20 22:52.16 ---> saved as "6bbb61c55263603ffcbfeb1bec23c6887e47df56eefc8d0ac06f15b41a09a392" Job succeeded 2025-03-20 22:52.17: Job succeeded
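A note on the training log above: each "Batch=..." record reports the batch index within the epoch, the cumulative optimization step, the current learning rate, the loss on that batch, and the running sum of batch losses for the epoch; the learning rate decays roughly linearly over these epochs (about 0.00025 every 60 steps). The stand-alone OCaml sketch below illustrates only that logging format together with an assumed linear schedule. It is not OCANNL's training code: the constants are guesses read off the log, and the random batch loss is a stand-in for a real forward/backward pass.

(* log_format_sketch.ml: hypothetical illustration only, not OCANNL code. *)
let () =
  let epochs = 20 and batches_per_epoch = 1200 and log_every = 60 in
  (* Assumed schedule, read off the log: roughly 0.2 at step 0, minus 1/240000 per step. *)
  let lr_at step = 0.2 -. float_of_int step /. 240_000. in
  Random.self_init ();
  let step = ref 0 in
  for epoch = 0 to epochs - 1 do
    let epoch_loss = ref 0.0 in
    for batch = 0 to batches_per_epoch - 1 do
      incr step;
      (* Stand-in for the real training step (loss computation and optimizer update). *)
      let batch_loss = Random.float 0.05 in
      epoch_loss := !epoch_loss +. batch_loss;
      if (batch + 1) mod log_every = 0 then
        Printf.printf "Batch=%d, step=%d, lr=%f, batch loss=%f, epoch loss=%f\n"
          batch !step (lr_at !step) batch_loss !epoch_loss
    done;
    Printf.printf "Epoch=%d, step=%d, lr=%f, epoch loss=%f\n"
      epoch !step (lr_at !step) !epoch_loss
  done

Running it with, e.g., ocaml log_format_sketch.ml prints records of the same shape as the log above, with random stand-in losses.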
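The scatterplot above shows the classic two half-moons dataset: '#' and '%' mark the two interleaving classes, and '*' and '.' shade the regions that the trained network assigns to each class. As a rough, self-contained illustration of the dataset itself (hypothetical code, not OCANNL's data generation or text plotting), the sketch below samples the two moons and scatters them on a character grid; shading the learned decision regions as in the plot above would additionally require evaluating the trained model at every grid cell.

(* half_moons_sketch.ml: hypothetical illustration only. *)
let half_moons ?(noise = 0.1) n =
  (* n points per class as (x, y, class): class 0 is the upper arc, class 1 the shifted lower arc. *)
  let pi = 4.0 *. atan 1.0 in
  let jitter () = noise *. (Random.float 2.0 -. 1.0) in
  let point cls =
    let t = Random.float pi in
    if cls = 0 then (cos t +. jitter (), sin t +. jitter (), 0)
    else (1.0 -. cos t +. jitter (), 0.5 -. sin t +. jitter (), 1)
  in
  List.init n (fun _ -> point 0) @ List.init n (fun _ -> point 1)

let () =
  Random.self_init ();
  let pts = half_moons 200 in
  let w, h = 80, 30 in
  let grid = Array.make_matrix h w '.' in
  let x_min, x_max = -1.5, 2.5 and y_min, y_max = -1.0, 1.5 in
  List.iter
    (fun (x, y, cls) ->
      let i = int_of_float ((x -. x_min) /. (x_max -. x_min) *. float_of_int (w - 1)) in
      let j = int_of_float ((y -. y_min) /. (y_max -. y_min) *. float_of_int (h - 1)) in
      if i >= 0 && i < w && j >= 0 && j < h then
        grid.(h - 1 - j).(i) <- (if cls = 0 then '#' else '%'))
    pts;
  Array.iter (fun row -> print_endline (String.init w (Array.get row))) grid

Run with, e.g., ocaml half_moons_sketch.ml; the output is a plain 80x30 character scatter of the same two arcs that appear in the plot above.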