2025-05-22 20:00.59: New job: test ahrefs/ocannl https://github.com/ahrefs/ocannl.git#refs/heads/master (39741884b740497ac10065d5e464e6c70f9151f4) (linux-x86_64:ubuntu-24.10-5.3_opam-2.3)
Base: ocaml/opam:ubuntu-24.10-ocaml-5.3@sha256:0a25304cf4c0aa68b1e67b003574dfd823cf0db0583f5b689fa7519cb6d82941
Opam project build
To reproduce locally:

git clone --recursive "https://github.com/ahrefs/ocannl.git" -b "master" && cd "ocannl" && git reset --hard 39741884
cat > Dockerfile <<'END-OF-DOCKERFILE'
FROM ocaml/opam:ubuntu-24.10-ocaml-5.3@sha256:0a25304cf4c0aa68b1e67b003574dfd823cf0db0583f5b689fa7519cb6d82941
# ubuntu-24.10-5.3_opam-2.3
USER 1000:1000
ENV CLICOLOR_FORCE="1"
ENV OPAMCOLOR="always"
WORKDIR /src
RUN sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam
RUN opam init --reinit -ni
RUN uname -rs && opam exec -- ocaml -version && opam --version
WORKDIR /src
RUN sudo chown opam /src
RUN cd ~/opam-repository && (git cat-file -e 2df846cb67d6f96ae4fced111519ff4ae27d19ae || git fetch origin master) && git reset -q --hard 2df846cb67d6f96ae4fced111519ff4ae27d19ae && git log --no-decorate -n1 --oneline && opam update -u
COPY --chown=1000:1000 neural_nets_lib.opam arrayjit.opam ./
RUN opam pin add -yn neural_nets_lib.dev './' && \
    opam pin add -yn arrayjit.dev './'
RUN echo '(lang dune 3.0)' > './dune-project'
ENV DEPS="angstrom.0.16.1 astring.0.8.5 backoff.0.1.1 base.v0.17.2 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.19.0 dune-configurator.3.19.0 fieldslib.v0.17.0 fmt.0.10.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 logs.0.8.0 mdx.2.5.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.0 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 pprint.20230830 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.2.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 result.1.5 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 thread-local-storage.0.2 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0"
ENV CI="true"
ENV OCAMLCI="true"
RUN opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS
RUN opam install $DEPS
COPY --chown=1000:1000 . /src
RUN opam exec -- dune build @install @check @runtest && rm -rf _build
END-OF-DOCKERFILE
docker build .
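A possible local workflow around the commands above, sketched here for convenience; the "ocannl-repro" tag and the interactive rerun are illustrative additions, not part of the CI job:

# Build with a tag so dependency layers are cached between attempts.
docker build -t ocannl-repro .
# If the final RUN (dune build @install @check @runtest) fails as in the log below, no
# tagged image is produced; dropping that last RUN from the Dockerfile and rebuilding
# yields an image with all dependencies installed, in which the tests can be rerun by hand:
docker run -it --rm ocannl-repro bash -lc 'opam exec -- dune build @runtest'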
END-REPRO-BLOCK 2025-05-22 20:00.59: Using cache hint "ahrefs/ocannl-ocaml/opam:ubuntu-24.10-ocaml-5.3@sha256:0a25304cf4c0aa68b1e67b003574dfd823cf0db0583f5b689fa7519cb6d82941-ubuntu-24.10-5.3_opam-2.3-63d0fa7caba437c680f3f62d33f451da" 2025-05-22 20:00.59: Using OBuilder spec: ((from ocaml/opam:ubuntu-24.10-ocaml-5.3@sha256:0a25304cf4c0aa68b1e67b003574dfd823cf0db0583f5b689fa7519cb6d82941) (comment ubuntu-24.10-5.3_opam-2.3) (user (uid 1000) (gid 1000)) (env CLICOLOR_FORCE 1) (env OPAMCOLOR always) (workdir /src) (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam")) (run (shell "opam init --reinit -ni")) (run (shell "uname -rs && opam exec -- ocaml -version && opam --version")) (workdir /src) (run (shell "sudo chown opam /src")) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 2df846cb67d6f96ae4fced111519ff4ae27d19ae || git fetch origin master) && git reset -q --hard 2df846cb67d6f96ae4fced111519ff4ae27d19ae && git log --no-decorate -n1 --oneline && opam update -u")) (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./)) (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'")) (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'")) (env DEPS "angstrom.0.16.1 astring.0.8.5 backoff.0.1.1 base.v0.17.2 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.19.0 dune-configurator.3.19.0 fieldslib.v0.17.0 fmt.0.10.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 logs.0.8.0 mdx.2.5.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.0 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 pprint.20230830 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.2.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 result.1.5 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 thread-local-storage.0.2 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0") (env CI true) (env OCAMLCI true) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS")) (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS")) (copy (src .) 
(dst /src)) (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build")) ) 2025-05-22 20:00.59: Waiting for resource in pool OCluster 2025-05-22 20:00.59: Waiting for worker… 2025-05-22 20:01.03: Got resource from pool OCluster Building on laodoke.caelum.ci.dev HEAD is now at 9afb61d2 In progress / broken: Format -> PPrint migration first pass by Claude HEAD is now at 39741884 Untested: convert remaining uses of Format except where printing Sexp values (from ocaml/opam:ubuntu-24.10-ocaml-5.3@sha256:0a25304cf4c0aa68b1e67b003574dfd823cf0db0583f5b689fa7519cb6d82941) 2025-05-22 20:11.46 ---> using "059ab306d13dde6c096265de48f1244fa252a88f7cda19c990bdf6764d9b1b77" from cache /: (comment ubuntu-24.10-5.3_opam-2.3) /: (user (uid 1000) (gid 1000)) /: (env CLICOLOR_FORCE 1) /: (env OPAMCOLOR always) /: (workdir /src) /src: (run (shell "sudo ln -f /usr/bin/opam-2.3 /usr/bin/opam")) 2025-05-22 20:11.46 ---> using "8eafc6b0bab597e974d083037a42ed271ed402b8c190bd88df3ecd967438431a" from cache /src: (run (shell "opam init --reinit -ni")) Configuring from /home/opam/.opamrc and then from built-in defaults. Checking for available remotes: rsync and local, git. - you won't be able to use mercurial repositories unless you install the hg command on your system. - you won't be able to use darcs repositories unless you install the darcs command on your system. This development version of opam requires an update to the layout of /home/opam/.opam from version 2.0 to version 2.2, which can't be reverted. You may want to back it up before going further. Continue? [y/n] y [NOTE] The 'jobs' option was reset, its value was 31 and its new value will vary according to the current number of cores on your machine. You can restore the fixed value using: opam option jobs=31 --global Format upgrade done. <><> Updating repositories ><><><><><><><><><><><><><><><><><><><><><><><><><><> [default] Initialised 2025-05-22 20:11.46 ---> using "485ff6c642bc23d14e63e49934b6c82edf7b317b00a7e9fbbbae86365a2be086" from cache /src: (run (shell "uname -rs && opam exec -- ocaml -version && opam --version")) Linux 5.15.0-139-generic The OCaml toplevel, version 5.3.0 2.3.0 2025-05-22 20:11.46 ---> using "ab887dbb0cc88d825230f3aa2acf967ca5b314f787b839adf7f8a61ab82a89e1" from cache /src: (workdir /src) /src: (run (shell "sudo chown opam /src")) 2025-05-22 20:11.46 ---> using "d1a8c68961392cfb32df5a5d4ae30de3c6bc539bd6e6c6e197e306d60d88d6ab" from cache /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "cd ~/opam-repository && (git cat-file -e 2df846cb67d6f96ae4fced111519ff4ae27d19ae || git fetch origin master) && git reset -q --hard 2df846cb67d6f96ae4fced111519ff4ae27d19ae && git log --no-decorate -n1 --oneline && opam update -u")) From https://github.com/ocaml/opam-repository * branch master -> FETCH_HEAD 35eb2f107a..2df846cb67 master -> origin/master 2df846cb67 Merge pull request #27910 from maiste/release-dune-3.19.0 <><> Updating package repositories ><><><><><><><><><><><><><><><><><><><><><><> [default] synchronised from git+file:///home/opam/opam-repository Everything as up-to-date as possible (run with --verbose to show unavailable upgrades). However, you may "opam upgrade" these packages explicitly, which will ask permission to downgrade or uninstall the conflicting packages. Nothing to do. 
# To update the current shell environment, run: eval $(opam env) 2025-05-22 20:11.46 ---> using "d28092168a70550c184404c74815b102fa4d3609bcc12c7a78bc0dfd1cb7b911" from cache /src: (copy (src neural_nets_lib.opam arrayjit.opam) (dst ./)) 2025-05-22 20:11.46 ---> saved as "28bcf7930cf49ddbd0bbc4855dcb6ee6c820a1857e83355fe5ddf87832ce6e08" /src: (run (network host) (shell "opam pin add -yn neural_nets_lib.dev './' && \ \nopam pin add -yn arrayjit.dev './'")) [neural_nets_lib.dev] synchronised (file:///src) neural_nets_lib is now pinned to file:///src (version dev) [arrayjit.dev] synchronised (file:///src) arrayjit is now pinned to file:///src (version dev) 2025-05-22 20:11.51 ---> saved as "6ce1720a843598c31630e294343a52757692fd0515a98a6c895d91d12fba9250" /src: (run (network host) (shell "echo '(lang dune 3.0)' > './dune-project'")) 2025-05-22 20:11.52 ---> saved as "98ce0985861a47f01c46d550fbd9bd7dd46cf41a7c167f008cf614dadf5710da" /src: (env DEPS "angstrom.0.16.1 astring.0.8.5 backoff.0.1.1 base.v0.17.2 base-bigarray.base base-domains.base base-effects.base base-nnp.base base-threads.base base-unix.base bigarray-compat.1.1.0 bigstringaf.0.10.0 camlp-streams.5.0.1 cmdliner.1.3.0 conf-libffi.2.0.0 conf-pkg-config.4 cppo.1.8.0 csexp.1.5.2 ctypes.0.23.0 ctypes-foreign.0.23.0 dune.3.19.0 dune-configurator.3.19.0 fieldslib.v0.17.0 fmt.0.10.0 integers.0.7.0 jane-street-headers.v0.17.0 jst-config.v0.17.0 logs.0.8.0 mdx.2.5.0 mtime.2.1.0 multicore-magic.2.3.1 num.1.5-1 ocaml.5.3.0 ocaml-base-compiler.5.3.0 ocaml-compiler.5.3.0 ocaml-compiler-libs.v0.17.0 ocaml-config.3 ocaml-options-vanilla.1 ocaml-syntax-shims.1.0.0 ocaml-version.4.0.0 ocaml_intrinsics_kernel.v0.17.1 ocamlbuild.0.16.1 ocamlfind.1.9.8 parsexp.v0.17.0 pprint.20230830 ppx_assert.v0.17.0 ppx_base.v0.17.0 ppx_cold.v0.17.0 ppx_compare.v0.17.0 ppx_derivers.1.2.1 ppx_deriving.6.0.3 ppx_enumerate.v0.17.0 ppx_expect.v0.17.2 ppx_fields_conv.v0.17.0 ppx_globalize.v0.17.0 ppx_hash.v0.17.0 ppx_here.v0.17.0 ppx_inline_test.v0.17.0 ppx_minidebug.2.2.0 ppx_optcomp.v0.17.0 ppx_sexp_conv.v0.17.0 ppx_string.v0.17.0 ppx_variants_conv.v0.17.0 ppxlib.0.35.0 ppxlib_jane.v0.17.2 printbox.0.12 printbox-ext-plot.0.12 printbox-html.0.12 printbox-md.0.12 printbox-text.0.12 ptime.1.2.0 re.1.12.0 result.1.5 saturn_lockfree.0.5.0 seq.base sexplib.v0.17.0 sexplib0.v0.17.0 stdio.v0.17.0 stdlib-shims.0.3.0 thread-local-storage.0.2 time_now.v0.17.0 topkg.1.0.8 tyxml.4.6.0 uucp.16.0.0 uutf.1.0.4 variantslib.v0.17.0") /src: (env CI true) /src: (env OCAMLCI true) /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam update --depexts && opam install --cli=2.3 --depext-only -y neural_nets_lib.dev arrayjit.dev $DEPS")) + /usr/bin/sudo "apt-get" "update" - Hit:1 http://archive.ubuntu.com/ubuntu oracular InRelease - Get:2 http://archive.ubuntu.com/ubuntu oracular-updates InRelease [126 kB] - Get:3 http://security.ubuntu.com/ubuntu oracular-security InRelease [126 kB] - Get:4 http://archive.ubuntu.com/ubuntu oracular-backports InRelease [126 kB] - Get:5 http://archive.ubuntu.com/ubuntu oracular-updates/main amd64 Packages [534 kB] - Get:6 http://archive.ubuntu.com/ubuntu oracular-updates/restricted amd64 Packages [312 kB] - Get:7 http://archive.ubuntu.com/ubuntu oracular-updates/universe amd64 Packages [323 kB] - Get:8 http://security.ubuntu.com/ubuntu oracular-security/multiverse amd64 Packages [11.8 kB] - Get:9 http://security.ubuntu.com/ubuntu oracular-security/universe amd64 Packages [243 kB] - Get:10 
http://security.ubuntu.com/ubuntu oracular-security/restricted amd64 Packages [306 kB] - Get:11 http://security.ubuntu.com/ubuntu oracular-security/main amd64 Packages [396 kB] - Fetched 2504 kB in 1s (3579 kB/s) - Reading package lists... - <><> Synchronising pinned packages ><><><><><><><><><><><><><><><><><><><><><><> [arrayjit.dev] synchronised (file:///src) [neural_nets_lib.dev] synchronised (file:///src) [NOTE] Package ocaml-options-vanilla is already installed (current version is 1). [NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). [NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). [NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). [NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). The following system packages will first need to be installed: libffi-dev pkg-config <><> Handling external dependencies <><><><><><><><><><><><><><><><><><><><><><> + /usr/bin/sudo "apt-get" "install" "-qq" "-yy" "libffi-dev" "pkg-config" - debconf: delaying package configuration, since apt-utils is not installed - Selecting previously unselected package libpkgconf3:amd64. - (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 16145 files and directories currently installed.) - Preparing to unpack .../libpkgconf3_1.8.1-3ubuntu1_amd64.deb ... - Unpacking libpkgconf3:amd64 (1.8.1-3ubuntu1) ... - Selecting previously unselected package pkgconf-bin. - Preparing to unpack .../pkgconf-bin_1.8.1-3ubuntu1_amd64.deb ... - Unpacking pkgconf-bin (1.8.1-3ubuntu1) ... - Selecting previously unselected package pkgconf:amd64. - Preparing to unpack .../pkgconf_1.8.1-3ubuntu1_amd64.deb ... - Unpacking pkgconf:amd64 (1.8.1-3ubuntu1) ... - Selecting previously unselected package pkg-config:amd64. - Preparing to unpack .../pkg-config_1.8.1-3ubuntu1_amd64.deb ... - Unpacking pkg-config:amd64 (1.8.1-3ubuntu1) ... - Selecting previously unselected package libffi-dev:amd64. - Preparing to unpack .../libffi-dev_3.4.6-1build1_amd64.deb ... - Unpacking libffi-dev:amd64 (3.4.6-1build1) ... - Setting up libffi-dev:amd64 (3.4.6-1build1) ... - Setting up libpkgconf3:amd64 (1.8.1-3ubuntu1) ... - Setting up pkgconf-bin (1.8.1-3ubuntu1) ... - Setting up pkgconf:amd64 (1.8.1-3ubuntu1) ... - Setting up pkg-config:amd64 (1.8.1-3ubuntu1) ... - Processing triggers for libc-bin (2.40-1ubuntu3.1) ... 
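The depext step above only adds libffi-dev and pkg-config, which conf-libffi and conf-pkg-config check for later. A quick sanity check that they are visible, sketched with standard pkg-config flags (not part of the CI job):

pkg-config --version             # confirms pkg-config itself is on PATH
pkg-config --modversion libffi   # prints the libffi version if libffi-dev's libffi.pc is found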
2025-05-22 20:12.19 ---> saved as "f87eda723391a90190d0c9cdc6650345d7a01ec03d4b3937a9428016757ee8e8" /src: (run (cache (opam-archives (target /home/opam/.opam/download-cache))) (network host) (shell "opam install $DEPS")) [NOTE] Package ocaml-options-vanilla is already installed (current version is 1). [NOTE] Package ocaml-config is already installed (current version is 3). [NOTE] Package ocaml-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml-base-compiler is already installed (current version is 5.3.0). [NOTE] Package ocaml is already installed (current version is 5.3.0). [NOTE] Package base-unix is already installed (current version is base). [NOTE] Package base-threads is already installed (current version is base). [NOTE] Package base-nnp is already installed (current version is base). [NOTE] Package base-effects is already installed (current version is base). [NOTE] Package base-domains is already installed (current version is base). [NOTE] Package base-bigarray is already installed (current version is base). The following actions will be performed: === install 75 packages - install angstrom 0.16.1 - install astring 0.8.5 - install backoff 0.1.1 - install base v0.17.2 - install bigarray-compat 1.1.0 - install bigstringaf 0.10.0 - install camlp-streams 5.0.1 - install cmdliner 1.3.0 - install conf-libffi 2.0.0 - install conf-pkg-config 4 - install cppo 1.8.0 - install csexp 1.5.2 - install ctypes 0.23.0 - install ctypes-foreign 0.23.0 - install dune 3.19.0 - install dune-configurator 3.19.0 - install fieldslib v0.17.0 - install fmt 0.10.0 - install integers 0.7.0 - install jane-street-headers v0.17.0 - install jst-config v0.17.0 - install logs 0.8.0 - install mdx 2.5.0 - install mtime 2.1.0 - install multicore-magic 2.3.1 - install num 1.5-1 - install ocaml-compiler-libs v0.17.0 - install ocaml-syntax-shims 1.0.0 - install ocaml-version 4.0.0 - install ocaml_intrinsics_kernel v0.17.1 - install ocamlbuild 0.16.1 - install ocamlfind 1.9.8 - install parsexp v0.17.0 - install pprint 20230830 - install ppx_assert v0.17.0 - install ppx_base v0.17.0 - install ppx_cold v0.17.0 - install ppx_compare v0.17.0 - install ppx_derivers 1.2.1 - install ppx_deriving 6.0.3 - install ppx_enumerate v0.17.0 - install ppx_expect v0.17.2 - install ppx_fields_conv v0.17.0 - install ppx_globalize v0.17.0 - install ppx_hash v0.17.0 - install ppx_here v0.17.0 - install ppx_inline_test v0.17.0 - install ppx_minidebug 2.2.0 - install ppx_optcomp v0.17.0 - install ppx_sexp_conv v0.17.0 - install ppx_string v0.17.0 - install ppx_variants_conv v0.17.0 - install ppxlib 0.35.0 - install ppxlib_jane v0.17.2 - install printbox 0.12 - install printbox-ext-plot 0.12 - install printbox-html 0.12 - install printbox-md 0.12 - install printbox-text 0.12 - install ptime 1.2.0 - install re 1.12.0 - install result 1.5 - install saturn_lockfree 0.5.0 - install seq base - install sexplib v0.17.0 - install sexplib0 v0.17.0 - install stdio v0.17.0 - install stdlib-shims 0.3.0 - install thread-local-storage 0.2 - install time_now v0.17.0 - install topkg 1.0.8 - install tyxml 4.6.0 - install uucp 16.0.0 - install uutf 1.0.4 - install variantslib v0.17.0 <><> Processing actions <><><><><><><><><><><><><><><><><><><><><><><><><><><><> -> retrieved backoff.0.1.1 (cached) -> retrieved angstrom.0.16.1 (cached) -> retrieved astring.0.8.5 (cached) -> retrieved bigarray-compat.1.1.0 (cached) -> retrieved bigstringaf.0.10.0 (cached) -> retrieved base.v0.17.2 (cached) -> retrieved camlp-streams.5.0.1 (cached) -> 
retrieved cmdliner.1.3.0 (cached) -> retrieved cppo.1.8.0 (cached) -> installed conf-pkg-config.4 -> retrieved csexp.1.5.2 (cached) -> retrieved ctypes.0.23.0, ctypes-foreign.0.23.0 (cached) -> installed conf-libffi.2.0.0 -> retrieved fieldslib.v0.17.0 (cached) -> retrieved fmt.0.10.0 (cached) -> retrieved integers.0.7.0 (cached) -> retrieved jane-street-headers.v0.17.0 (cached) -> retrieved jst-config.v0.17.0 (cached) -> retrieved logs.0.8.0 (cached) -> retrieved mtime.2.1.0 (cached) -> retrieved mdx.2.5.0 (cached) -> retrieved multicore-magic.2.3.1 (cached) -> retrieved num.1.5-1 (cached) -> retrieved ocaml-compiler-libs.v0.17.0 (cached) -> retrieved ocaml-syntax-shims.1.0.0 (cached) -> retrieved ocaml-version.4.0.0 (cached) -> retrieved ocaml_intrinsics_kernel.v0.17.1 (cached) -> retrieved ocamlbuild.0.16.1 (cached) -> retrieved ocamlfind.1.9.8 (cached) -> retrieved parsexp.v0.17.0 (cached) -> retrieved pprint.20230830 (cached) -> retrieved ppx_assert.v0.17.0 (cached) -> retrieved ppx_base.v0.17.0 (cached) -> retrieved ppx_cold.v0.17.0 (cached) -> retrieved dune.3.19.0, dune-configurator.3.19.0 (cached) -> retrieved ppx_compare.v0.17.0 (cached) -> retrieved ppx_derivers.1.2.1 (cached) -> retrieved ppx_deriving.6.0.3 (cached) -> retrieved ppx_enumerate.v0.17.0 (cached) -> retrieved ppx_expect.v0.17.2 (cached) -> retrieved ppx_fields_conv.v0.17.0 (cached) -> retrieved ppx_globalize.v0.17.0 (cached) -> retrieved ppx_hash.v0.17.0 (cached) -> installed cmdliner.1.3.0 -> installed num.1.5-1 -> retrieved ppx_here.v0.17.0 (cached) -> retrieved ppx_inline_test.v0.17.0 (cached) -> retrieved ppx_minidebug.2.2.0 (cached) -> retrieved ppx_optcomp.v0.17.0 (cached) -> retrieved ppx_sexp_conv.v0.17.0 (cached) -> retrieved ppx_string.v0.17.0 (cached) -> retrieved ppx_variants_conv.v0.17.0 (cached) -> retrieved ppxlib_jane.v0.17.2 (cached) -> retrieved ptime.1.2.0 (cached) -> retrieved re.1.12.0 (cached) -> retrieved result.1.5 (cached) -> retrieved saturn_lockfree.0.5.0 (cached) -> retrieved seq.base (cached) -> installed seq.base -> retrieved sexplib.v0.17.0 (cached) -> retrieved sexplib0.v0.17.0 (cached) -> retrieved stdio.v0.17.0 (cached) -> retrieved ppxlib.0.35.0 (cached) -> retrieved stdlib-shims.0.3.0 (cached) -> retrieved thread-local-storage.0.2 (cached) -> retrieved time_now.v0.17.0 (cached) -> retrieved printbox.0.12, printbox-ext-plot.0.12, printbox-html.0.12, printbox-md.0.12, printbox-text.0.12 (cached) -> retrieved topkg.1.0.8 (cached) -> retrieved tyxml.4.6.0 (cached) -> retrieved uutf.1.0.4 (cached) -> retrieved variantslib.v0.17.0 (cached) -> retrieved uucp.16.0.0 (cached) -> installed ocamlfind.1.9.8 -> installed ocamlbuild.0.16.1 -> installed topkg.1.0.8 -> installed mtime.2.1.0 -> installed uutf.1.0.4 -> installed fmt.0.10.0 -> installed ptime.1.2.0 -> installed astring.0.8.5 -> installed logs.0.8.0 -> installed dune.3.19.0 -> installed jane-street-headers.v0.17.0 -> installed csexp.1.5.2 -> installed backoff.0.1.1 -> installed bigarray-compat.1.1.0 -> installed camlp-streams.5.0.1 -> installed multicore-magic.2.3.1 -> installed ocaml-version.4.0.0 -> installed ocaml_intrinsics_kernel.v0.17.1 -> installed pprint.20230830 -> installed ppx_derivers.1.2.1 -> installed printbox.0.12 -> installed result.1.5 -> installed sexplib0.v0.17.0 -> installed stdlib-shims.0.3.0 -> installed thread-local-storage.0.2 -> installed ocaml-syntax-shims.1.0.0 -> installed ocaml-compiler-libs.v0.17.0 -> installed re.1.12.0 -> installed cppo.1.8.0 -> installed integers.0.7.0 -> installed 
saturn_lockfree.0.5.0 -> installed dune-configurator.3.19.0 -> installed parsexp.v0.17.0 -> installed bigstringaf.0.10.0 -> installed sexplib.v0.17.0 -> installed angstrom.0.16.1 -> installed mdx.2.5.0 -> installed tyxml.4.6.0 -> installed printbox-html.0.12 -> installed ctypes.0.23.0 -> installed base.v0.17.2 -> installed ctypes-foreign.0.23.0 -> installed variantslib.v0.17.0 -> installed fieldslib.v0.17.0 -> installed stdio.v0.17.0 -> installed uucp.16.0.0 -> installed printbox-text.0.12 -> installed printbox-md.0.12 -> installed printbox-ext-plot.0.12 -> installed ppxlib.0.35.0 -> installed ppxlib_jane.v0.17.2 -> installed ppx_optcomp.v0.17.0 -> installed ppx_here.v0.17.0 -> installed ppx_cold.v0.17.0 -> installed ppx_variants_conv.v0.17.0 -> installed ppx_fields_conv.v0.17.0 -> installed ppx_globalize.v0.17.0 -> installed ppx_enumerate.v0.17.0 -> installed ppx_deriving.6.0.3 -> installed ppx_compare.v0.17.0 -> installed ppx_sexp_conv.v0.17.0 -> installed ppx_hash.v0.17.0 -> installed ppx_assert.v0.17.0 -> installed ppx_base.v0.17.0 -> installed ppx_minidebug.2.2.0 -> installed jst-config.v0.17.0 -> installed ppx_string.v0.17.0 -> installed time_now.v0.17.0 -> installed ppx_inline_test.v0.17.0 -> installed ppx_expect.v0.17.2 Done. # To update the current shell environment, run: eval $(opam env) 2025-05-22 20:14.43 ---> saved as "70740b20f19206dfcdf836807f958cd5fddc8aa115512faa8b3718c23ea48df6" /src: (copy (src .) (dst /src)) 2025-05-22 20:14.45 ---> saved as "afa8c266659caf3f53f9bee33df6fd2e4f36cbe57ecd952be6a81db227e5d078" /src: (run (shell "opam exec -- dune build @install @check @runtest && rm -rf _build")) (cd _build/default/test/config && ../../arrayjit/bin/read_config.exe --read=backend) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test/config/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file Wrote value of 'backend' to ocannl_backend.txt (cd _build/default/test_ppx && ./test_ppx_op_expected.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/default/test_ppx && ./test_ppx_op.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test_ppx/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition 'Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/1caa10aa9343f680d2c06596b0a70234/default/test/ocannl_config.' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file (cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition '' -source-tree-root .. -diff-cmd -) Welcome to OCANNL! Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config. 
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition 'Found 0, in the config file' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition 'Retrieving commandline, environment, or config file variable ocannl_log_level' -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
File "test/dune", lines 30-40, characters 0-281:
30 | (rule
31 |  (alias runtest)
32 |  (target
33 |   (dir log_files))
34 |  (action
35 |   (run
36 |    %{dep:micrograd_demo_logging.exe}
37 |    "--ocannl_debug_backend=text"
38 |    "--ocannl_log_file_stem=micrograd_demo_logging"
39 |    "--ocannl_log_main_domain_to_stdout=false"
40 |    "--ocannl_debug_log_to_stream_files=true")))
(cd _build/default/test && ./micrograd_demo_logging.exe --ocannl_debug_backend=text --ocannl_log_file_stem=micrograd_demo_logging --ocannl_log_main_domain_to_stdout=false --ocannl_debug_log_to_stream_files=true)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
Retrieving commandline, environment, or config file variable ocannl_backend
Found multicore_cc, in the config file
Retrieving commandline, environment, or config file variable ocannl_cd_ident_style
Not found, using default heuristic
Retrieving commandline, environment, or config file variable ocannl_ll_ident_style
Not found, using default heuristic
Retrieving commandline, environment, or config file variable ocannl_prefer_backend_uniformity
Found true, in the config file
Retrieving commandline, environment, or config file variable ocannl_debug_log_to_stream_files
Found true, commandline --ocannl_debug_log_to_stream_files=true
Retrieving commandline, environment, or config file variable ocannl_cc_backend_optimization_level
Not found, using default 3
Retrieving commandline, environment, or config file variable ocannl_cc_backend_compiler_command
Not found, using default gcc
Fatal error: exception File "src/printbox-text/PrintBox_text.ml", line 212, characters 6-12: Assertion failed
Raised at PrintBox_text.Output.Make_out.to_buf_aux_ in file "src/printbox-text/PrintBox_text.ml", line 212, characters 6-50
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 19-42
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 26-41
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 26-41
Called from Stdlib__Map.Make.fold in file "map.ml", line 329, characters 26-41
Called from PrintBox_text.Output.Make_out.render in file "src/printbox-text/PrintBox_text.ml", line 242, characters 14-64
Called from PrintBox_text.output in file "src/printbox-text/PrintBox_text.ml", line 851, characters 2-31
Called from Minidebug_runtime.PrintBox.output_box in file "minidebug_runtime.ml", line 1527, characters 19-59
Called from Minidebug_runtime.PrintBox.close_log_impl.close_tree in file "minidebug_runtime.ml", line 1572, characters 6-38
Called from Backends.Add_buffer_retrieval_and_syncing.sync_routine in file "arrayjit/lib/backends.ml", lines 144-172, characters 31-82
Called from Backends.Raise_backend.link in file "arrayjit/lib/backends.ml", lines 454-455, characters 4-92
Re-raised at Backends.Raise_backend.link in file "arrayjit/lib/backends.ml", lines 441-455, characters 23-92
Called from Dune__exe__Micrograd_demo_logging in file "test/micrograd_demo_logging.ml", line 34, characters 13-77
(cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition primitive_ops.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition zero2hero_1of7.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition hello_world_op.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition einsum_trivia.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition micrograd_demo.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level
Found 0, in the config file
(cd _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test && .tutorials.inline-tests/inline-test-runner.exe inline-test-runner tutorials -partition moons_demo_parallel.ml -source-tree-root .. -diff-cmd -)
Welcome to OCANNL!
Reading configuration defaults from /src/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/ocannl_config.
Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file File "test/micrograd_demo.ml", line 1, characters 0-0: /usr/bin/git --no-pager diff --no-index --color=always -u _build/default/test/micrograd_demo.ml _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/micrograd_demo.ml.corrected diff --git a/_build/default/test/micrograd_demo.ml b/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/micrograd_demo.ml.corrected index 77e46c6..ab81526 100644 --- a/_build/default/test/micrograd_demo.ml +++ b/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/micrograd_demo.ml.corrected @@ -52,15 +52,14 @@ let%expect_test "Micrograd README basic example" = │├┼───────┤ │ │││ -4.00 │ │ │└┴───────┘ │ - └─────────────────┘ - ┌────────────────────────┐ - │[0]: a shape 0:1 grad_a│ - │┌┬─────────┐ │ - │││axis 0 │ │ - │├┼─────────┤ │ - │││ 1.38e+2 │ │ - │└┴─────────┘ │ - └────────────────────────┘ + └─────────────────┘┌────────────────────────┐ + │[0]: a shape 0:1 grad_a│ + │┌┬─────────┐ │ + │││axis 0 │ │ + │├┼─────────┤ │ + │││ 1.38e+2 │ │ + │└┴─────────┘ │ + └────────────────────────┘ |}]; Tensor.print ~with_code:false ~with_grad:true `Default b; [%expect @@ -72,15 +71,14 @@ let%expect_test "Micrograd README basic example" = │├┼──────┤ │ │││ 2.00 │ │ │└┴──────┘ │ - └─────────────────┘ - ┌────────────────────────┐ - │[2]: b shape 0:1 grad_b│ - │┌┬─────────┐ │ - │││axis 0 │ │ - │├┼─────────┤ │ - │││ 6.45e+2 │ │ - │└┴─────────┘ │ - └────────────────────────┘ + └─────────────────┘┌────────────────────────┐ + │[2]: b shape 0:1 grad_b│ + │┌┬─────────┐ │ + │││axis 0 │ │ + │├┼─────────┤ │ + │││ 6.45e+2 │ │ + │└┴─────────┘ │ + └────────────────────────┘ |}] let%expect_test "Micrograd half-moons example" = File "test/hello_world_op.ml", line 1, characters 0-0: /usr/bin/git --no-pager diff --no-index --color=always -u _build/default/test/hello_world_op.ml _build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/hello_world_op.ml.corrected diff --git a/_build/default/test/hello_world_op.ml b/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/hello_world_op.ml.corrected index ba9d7ef..6bfa309 100644 --- a/_build/default/test/hello_world_op.ml +++ b/_build/.sandbox/f7d6af23e5b7ab37de3fa19401ef3c41/default/test/hello_world_op.ml.corrected @@ -102,36 +102,46 @@ let%expect_test "Print constant tensor" = let%op hey = [ (1, 2, 3); (4, 5, 6) ] in Train.forward_and_forget backend ctx hey; Tensor.print ~with_code:false ~with_grad:false `Inline @@ hey; - [%expect {| [1.00, 2.00, 3.00; 4.00, 5.00, 6.00] |}]; + [%expect {| + [0]: [ 1.00 , 2.00 , 3.00 ; 4.00 , 5.00 , 6.00 ]_hey shape 1:3->0:2 [ + 1.00 , 2.00 , 3.00 + ; 4.00 , 5.00 , 6.00 + ] + |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ hey; [%expect {| - ┌─────────────────────────────────────────────────────────────┐ - │[0]: [1.00, 2.00, 3.00; 4.00, 5.00, 6.00]_hey shape 1:3->0:2 │ - │┌──────┬──────────────────┐ │ - ││ │axis 1 │ │ - │├──────┼──────────────────┤ │ - ││axis 0│ 1.00 2.00 3.00 │ │ - ││ │ 4.00 5.00 6.00 │ │ - │└──────┴──────────────────┘ │ - └─────────────────────────────────────────────────────────────┘ + ┌────────────────────────────────────────────────────────────────────────┐ + │[0]: [ 1.00 , 2.00 , 3.00 ; 4.00 , 5.00 , 6.00 ]_hey shape 1:3->0:2 │ + │┌──────┬──────────────────┐ │ + ││ │axis 1 │ │ + │├──────┼──────────────────┤ │ + ││axis 0│ 1.00 2.00 3.00 │ │ + ││ │ 4.00 5.00 6.00 │ │ + │└──────┴──────────────────┘ │ + 
└────────────────────────────────────────────────────────────────────────┘ |}]; let%op hoo = [| [ 1; 2; 3 ]; [ 4; 5; 6 ] |] in Train.forward_and_forget backend ctx hoo; Tensor.print ~with_code:false ~with_grad:false `Inline @@ hoo; - [%expect {| [|[1.00; 2.00; 3.00]; [4.00; 5.00; 6.00]|] |}]; + [%expect {| + [1]: [| [ 1.00 ; 2.00 ; 3.00 ] ; [ 4.00 ; 5.00 ; 6.00 ] |]_hoo shape 0:2|1:3 [| + [ 1.00 ; 2.00 ; 3.00 ] + ; [ 4.00 ; 5.00 ; 6.00 ] + |] + |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ hoo; [%expect {| - ┌──────────────────────────────────────────────────────────────────┐ - │[1]: [|[1.00; 2.00; 3.00]; [4.00; 5.00; 6.00]|]_hoo shape 0:2|1:3 │ - │┌──────┬──────────────────┐ │ - ││ │axis 1 │ │ - │├──────┼──────────────────┤ │ - ││axis 0│ 1.00 2.00 3.00 │ │ - ││ │ 4.00 5.00 6.00 │ │ - │└──────┴──────────────────┘ │ - └──────────────────────────────────────────────────────────────────┘ + ┌─────────────────────────────────────────────────────────────────────────────┐ + │[1]: [| [ 1.00 ; 2.00 ; 3.00 ] ; [ 4.00 ; 5.00 ; 6.00 ] |]_hoo shape 0:2|1:3 │ + │┌──────┬──────────────────┐ │ + ││ │axis 1 │ │ + │├──────┼──────────────────┤ │ + ││axis 0│ 1.00 2.00 3.00 │ │ + ││ │ 4.00 5.00 6.00 │ │ + │└──────┴──────────────────┘ │ + └─────────────────────────────────────────────────────────────────────────────┘ |}]; let%op hey2 = [ @@ -145,10 +155,12 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ hey2; [%expect {| - [(1.00, 2.00, 3.00), (4.00, 5.00, 6.00); - (7.00, 8.00, 9.00), (10.00, 11.00, 12.00); - (13.00, 14.00, 15.00), (16.00, 17.00, 18.00); - (19.00, 20.00, 21.00), (22.00, 23.00, 24.00)] + [2]: c4x2x3_hey2 shape 1:2,2:3->0:4 [ + ( 1.00 , 2.00 , 3.00 ) , ( 4.00 , 5.00 , 6.00 ) + ; ( 7.00 , 8.00 , 9.00 ) , ( 10.00 , 11.00 , 12.00 ) + ; ( 13.00 , 14.00 , 15.00 ) , ( 16.00 , 17.00 , 18.00 ) + ; ( 19.00 , 20.00 , 21.00 ) , ( 22.00 , 23.00 , 24.00 ) + ] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ hey2; [%expect @@ -178,10 +190,12 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ hoo2; [%expect {| - [|[[1.00; 2.00; 3.00]; [4.00; 5.00; 6.00]]; - [[7.00; 8.00; 9.00]; [10.00; 11.00; 12.00]]; - [[13.00; 14.00; 15.00]; [16.00; 17.00; 18.00]]; - [[19.00; 20.00; 21.00]; [22.00; 23.00; 24.00]]|] + [3]: c4x2x3_hoo2 shape 0:4|1:2,2:3 [| + [ [ 1.00 ; 2.00 ; 3.00 ] ; [ 4.00 ; 5.00 ; 6.00 ] ] + ; [ [ 7.00 ; 8.00 ; 9.00 ] ; [ 10.00 ; 11.00 ; 12.00 ] ] + ; [ [ 13.00 ; 14.00 ; 15.00 ] ; [ 16.00 ; 17.00 ; 18.00 ] ] + ; [ [ 19.00 ; 20.00 ; 21.00 ] ; [ 22.00 ; 23.00 ; 24.00 ] ] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ hoo2; [%expect @@ -209,10 +223,12 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ heyhoo; [%expect {| - [|[|[1.00; 2.00; 3.00]; [4.00; 5.00; 6.00]|]; - [|[7.00; 8.00; 9.00]; [10.00; 11.00; 12.00]|]; - [|[13.00; 14.00; 15.00]; [16.00; 17.00; 18.00]|]; - [|[19.00; 20.00; 21.00]; [22.00; 23.00; 24.00]|]|] + [4]: c4x2x3_heyhoo shape 0:4,1:2|2:3 [| + [| [ 1.00 ; 2.00 ; 3.00 ] ; [ 4.00 ; 5.00 ; 6.00 ] |] + ; [| [ 7.00 ; 8.00 ; 9.00 ] ; [ 10.00 ; 11.00 ; 12.00 ] |] + ; [| [ 13.00 ; 14.00 ; 15.00 ] ; [ 16.00 ; 17.00 ; 18.00 ] |] + ; [| [ 19.00 ; 20.00 ; 21.00 ] ; [ 22.00 ; 23.00 ; 24.00 ] |] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ heyhoo; [%expect @@ -240,15 +256,24 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline 
@@ heyhoo2; [%expect {| - [| - [|[[1.00; 31.00]; [2.00; 32.00]; [3.00; 33.00]]; - [[4.00; 34.00]; [5.00; 35.00]; [6.00; 36.00]]|]; - [|[[7.00; 37.00]; [8.00; 38.00]; [9.00; 39.00]]; - [[10.00; 40.00]; [11.00; 41.00]; [12.00; 42.00]]|]; - [|[[13.00; 43.00]; [14.00; 44.00]; [15.00; 45.00]]; - [[16.00; 46.00]; [17.00; 47.00]; [18.00; 48.00]]|]; - [|[[19.00; 49.00]; [20.00; 50.00]; [21.00; 51.00]]; - [[22.00; 52.00]; [23.00; 53.00]; [24.00; 54.00]]|]|] + [5]: c4x2x3x2_heyhoo2 shape 0:4,1:2|2:3,3:2 [| + [| + [ [ 1.00 ; 31.00 ] ; [ 2.00 ; 32.00 ] ; [ 3.00 ; 33.00 ] ] + ; [ [ 4.00 ; 34.00 ] ; [ 5.00 ; 35.00 ] ; [ 6.00 ; 36.00 ] ] + |] + ; [| + [ [ 7.00 ; 37.00 ] ; [ 8.00 ; 38.00 ] ; [ 9.00 ; 39.00 ] ] + ; [ [ 10.00 ; 40.00 ] ; [ 11.00 ; 41.00 ] ; [ 12.00 ; 42.00 ] ] + |] + ; [| + [ [ 13.00 ; 43.00 ] ; [ 14.00 ; 44.00 ] ; [ 15.00 ; 45.00 ] ] + ; [ [ 16.00 ; 46.00 ] ; [ 17.00 ; 47.00 ] ; [ 18.00 ; 48.00 ] ] + |] + ; [| + [ [ 19.00 ; 49.00 ] ; [ 20.00 ; 50.00 ] ; [ 21.00 ; 51.00 ] ] + ; [ [ 22.00 ; 52.00 ] ; [ 23.00 ; 53.00 ] ; [ 24.00 ; 54.00 ] ] + |] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ heyhoo2; [%expect @@ -293,17 +318,28 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ heyhoo3; [%expect {| - [| + [6]: c2x2x2x3x2_heyhoo3 shape 0:2,1:2|2:2,3:3,4:2 [| [| - [[[1.00; 31.00]; [2.00; 32.00]; [3.00; 33.00]]; - [[4.00; 34.00]; [5.00; 35.00]; [6.00; 36.00]]]; - [[[7.00; 37.00]; [8.00; 38.00]; [9.00; 39.00]]; - [[10.00; 40.00]; [11.00; 41.00]; [12.00; 42.00]]]|]; - [| - [[[13.00; 43.00]; [14.00; 44.00]; [15.00; 45.00]]; - [[16.00; 46.00]; [17.00; 47.00]; [18.00; 48.00]]]; - [[[19.00; 49.00]; [20.00; 50.00]; [21.00; 51.00]]; - [[22.00; 52.00]; [23.00; 53.00]; [24.00; 54.00]]]|]|] + [ + [ [ 1.00 ; 31.00 ] ; [ 2.00 ; 32.00 ] ; [ 3.00 ; 33.00 ] ] + ; [ [ 4.00 ; 34.00 ] ; [ 5.00 ; 35.00 ] ; [ 6.00 ; 36.00 ] ] + ] + ; [ + [ [ 7.00 ; 37.00 ] ; [ 8.00 ; 38.00 ] ; [ 9.00 ; 39.00 ] ] + ; [ [ 10.00 ; 40.00 ] ; [ 11.00 ; 41.00 ] ; [ 12.00 ; 42.00 ] ] + ] + |] + ; [| + [ + [ [ 13.00 ; 43.00 ] ; [ 14.00 ; 44.00 ] ; [ 15.00 ; 45.00 ] ] + ; [ [ 16.00 ; 46.00 ] ; [ 17.00 ; 47.00 ] ; [ 18.00 ; 48.00 ] ] + ] + ; [ + [ [ 19.00 ; 49.00 ] ; [ 20.00 ; 50.00 ] ; [ 21.00 ; 51.00 ] ] + ; [ [ 22.00 ; 52.00 ] ; [ 23.00 ; 53.00 ] ; [ 24.00 ; 54.00 ] ] + ] + |] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ heyhoo3; [%expect @@ -353,17 +389,28 @@ let%expect_test "Print constant tensor" = Tensor.print ~with_code:false ~with_grad:false `Inline @@ heyhoo4; [%expect {| - [| - [ - [[1.00, 31.00; 2.00, 32.00; 3.00, 33.00]; - [4.00, 34.00; 5.00, 35.00; 6.00, 36.00]]; - [[7.00, 37.00; 8.00, 38.00; 9.00, 39.00]; - [10.00, 40.00; 11.00, 41.00; 12.00, 42.00]]]; + [7]: c2x2x2x3x2_heyhoo4 shape 0:2|4:2->1:2,2:2,3:3 [| [ - [[13.00, 43.00; 14.00, 44.00; 15.00, 45.00]; - [16.00, 46.00; 17.00, 47.00; 18.00, 48.00]]; - [[19.00, 49.00; 20.00, 50.00; 21.00, 51.00]; - [22.00, 52.00; 23.00, 53.00; 24.00, 54.00]]]|] + [ + [ 1.00 , 31.00 ; 2.00 , 32.00 ; 3.00 , 33.00 ] + ; [ 4.00 , 34.00 ; 5.00 , 35.00 ; 6.00 , 36.00 ] + ] + ; [ + [ 7.00 , 37.00 ; 8.00 , 38.00 ; 9.00 , 39.00 ] + ; [ 10.00 , 40.00 ; 11.00 , 41.00 ; 12.00 , 42.00 ] + ] + ] + ; [ + [ + [ 13.00 , 43.00 ; 14.00 , 44.00 ; 15.00 , 45.00 ] + ; [ 16.00 , 46.00 ; 17.00 , 47.00 ; 18.00 , 48.00 ] + ] + ; [ + [ 19.00 , 49.00 ; 20.00 , 50.00 ; 21.00 , 51.00 ] + ; [ 22.00 , 52.00 ; 23.00 , 53.00 ; 24.00 , 54.00 ] + ] + ] + |] |}]; Tensor.print ~with_code:false ~with_grad:false `Default @@ heyhoo4; 
[%expect @@ -462,8 +509,29 @@ let%expect_test "Big matrix" = Tensor.print ~with_code:false ~with_grad:false `Inline zero_to_twenty; [%expect {| - [0.00; 1.00; 2.00; 3.00; 4.00; 5.00; 6.00; 7.00; 8.00; 9.00; 10.00; 11.00; - 12.00; 13.00; 14.00; 15.00; 16.00; 17.00; 18.00; 19.00; 20.00] + [2]: 0...20 shape 0:21 [ + 0.00 + ; 1.00 + ; 2.00 + ; 3.00 + ; 4.00 + ; 5.00 + ; 6.00 + ; 7.00 + ; 8.00 + ; 9.00 + ; 10.00 + ; 11.00 + ; 12.00 + ; 13.00 + ; 14.00 + ; 15.00 + ; 16.00 + ; 17.00 + ; 18.00 + ; 19.00 + ; 20.00 + ] |}]; Tensor.print ~with_code:false ~with_grad:false `Default zero_to_twenty; [%expect (cd _build/default/test && ./moons_demo_parallel_run.exe) Welcome to OCANNL! Reading configuration defaults from /src/_build/default/test/ocannl_config. Retrieving commandline, environment, or config file variable ocannl_log_level Found 0, in the config file ("Set log_level to" 1) └─{orphaned from #2} Retrieving commandline, environment, or config file variable ocannl_backend Found multicore_cc, in the config file Properties of devices: (multicore_devices (device ((device_name CPU) (device_ordinal 0) (num_domains 72)))) @!Retrieving commandline, environment, or config file variable ocannl_prefer_backend_uniformity Found true, in the config file Retrieving commandline, environment, or config file variable ocannl_debug_log_to_stream_files Not found, using default false Retrieving commandline, environment, or config file variable ocannl_ll_ident_style Not found, using default heuristic Retrieving commandline, environment, or config file variable ocannl_cc_backend_optimization_level Not found, using default 3 Retrieving commandline, environment, or config file variable ocannl_cc_backend_compiler_command Not found, using default gcc Retrieving commandline, environment, or config file variable ocannl_never_capture_stdout Not found, using default false Batch=59, step=60, lr=0.199750, batch loss=23.609453, epoch loss=23.609453 Batch=119, step=120, lr=0.199500, batch loss=8.516926, epoch loss=32.126379 Batch=179, step=180, lr=0.199500, batch loss=2.639251, epoch loss=34.765630 Batch=239, step=240, lr=0.199250, batch loss=0.850854, epoch loss=35.616485 Batch=299, step=300, lr=0.198750, batch loss=1.448342, epoch loss=37.064827 Batch=359, step=360, lr=0.198750, batch loss=1.333968, epoch loss=38.398795 Batch=419, step=420, lr=0.198500, batch loss=0.616298, epoch loss=39.015092 Batch=479, step=480, lr=0.198250, batch loss=0.809647, epoch loss=39.824739 Batch=539, step=540, lr=0.198000, batch loss=0.711804, epoch loss=40.536544 Batch=599, step=600, lr=0.197750, batch loss=1.067621, epoch loss=41.604164 Batch=659, step=660, lr=0.197500, batch loss=0.482730, epoch loss=42.086894 Batch=719, step=720, lr=0.197250, batch loss=0.411195, epoch loss=42.498089 Batch=779, step=780, lr=0.197000, batch loss=0.469148, epoch loss=42.967237 Batch=839, step=840, lr=0.196750, batch loss=0.444138, epoch loss=43.411375 Batch=899, step=900, lr=0.196500, batch loss=0.382930, epoch loss=43.794306 Batch=959, step=960, lr=0.196250, batch loss=0.241060, epoch loss=44.035365 Batch=1019, step=1020, lr=0.196000, batch loss=0.453436, epoch loss=44.488802 Batch=1079, step=1080, lr=0.195750, batch loss=0.257694, epoch loss=44.746495 Batch=1139, step=1140, lr=0.195500, batch loss=0.341047, epoch loss=45.087542 Batch=1199, step=1200, lr=0.195250, batch loss=0.260909, epoch loss=45.348451 Epoch=0, step=1200, lr=0.195250, epoch loss=45.348451 Batch=59, step=1260, lr=0.195000, batch loss=0.261918, epoch loss=0.261918 Batch=119, step=1320, 
lr=0.194750, batch loss=0.206645, epoch loss=0.468563 Batch=179, step=1380, lr=0.194500, batch loss=0.247177, epoch loss=0.715740 Batch=239, step=1440, lr=0.194250, batch loss=0.352023, epoch loss=1.067762 Batch=299, step=1500, lr=0.193750, batch loss=0.236519, epoch loss=1.304282 Batch=359, step=1560, lr=0.193750, batch loss=0.311926, epoch loss=1.616208 Batch=419, step=1620, lr=0.193500, batch loss=0.310293, epoch loss=1.926501 Batch=479, step=1680, lr=0.193250, batch loss=0.276682, epoch loss=2.203182 Batch=539, step=1740, lr=0.193000, batch loss=0.211946, epoch loss=2.415128 Batch=599, step=1800, lr=0.192750, batch loss=0.258331, epoch loss=2.673459 Batch=659, step=1860, lr=0.192500, batch loss=0.369058, epoch loss=3.042516 Batch=719, step=1920, lr=0.192250, batch loss=0.357009, epoch loss=3.399525 Batch=779, step=1980, lr=0.192000, batch loss=0.382593, epoch loss=3.782118 Batch=839, step=2040, lr=0.191500, batch loss=0.341178, epoch loss=4.123296 Batch=899, step=2100, lr=0.191250, batch loss=0.297337, epoch loss=4.420633 Batch=959, step=2160, lr=0.191250, batch loss=0.221693, epoch loss=4.642326 Batch=1019, step=2220, lr=0.191000, batch loss=0.327972, epoch loss=4.970298 Batch=1079, step=2280, lr=0.190750, batch loss=0.198511, epoch loss=5.168809 Batch=1139, step=2340, lr=0.190500, batch loss=0.269913, epoch loss=5.438722 Batch=1199, step=2400, lr=0.190250, batch loss=0.216296, epoch loss=5.655018 Epoch=1, step=2400, lr=0.190250, epoch loss=5.655018 Batch=59, step=2460, lr=0.190000, batch loss=0.227146, epoch loss=0.227146 Batch=119, step=2520, lr=0.189500, batch loss=0.185876, epoch loss=0.413022 Batch=179, step=2580, lr=0.189250, batch loss=0.217954, epoch loss=0.630976 Batch=239, step=2640, lr=0.189250, batch loss=0.327806, epoch loss=0.958783 Batch=299, step=2700, lr=0.189000, batch loss=0.219404, epoch loss=1.178187 Batch=359, step=2760, lr=0.188750, batch loss=0.299348, epoch loss=1.477535 Batch=419, step=2820, lr=0.188500, batch loss=0.294324, epoch loss=1.771859 Batch=479, step=2880, lr=0.188250, batch loss=0.275400, epoch loss=2.047259 Batch=539, step=2940, lr=0.188000, batch loss=0.205879, epoch loss=2.253138 Batch=599, step=3000, lr=0.187750, batch loss=0.257785, epoch loss=2.510924 Batch=659, step=3060, lr=0.187500, batch loss=0.349324, epoch loss=2.860247 Batch=719, step=3120, lr=0.187250, batch loss=0.347939, epoch loss=3.208186 Batch=779, step=3180, lr=0.187000, batch loss=0.363164, epoch loss=3.571350 Batch=839, step=3240, lr=0.186750, batch loss=0.325690, epoch loss=3.897040 Batch=899, step=3300, lr=0.186500, batch loss=0.294742, epoch loss=4.191782 Batch=959, step=3360, lr=0.186250, batch loss=0.228767, epoch loss=4.420549 Batch=1019, step=3420, lr=0.186000, batch loss=0.338788, epoch loss=4.759337 Batch=1079, step=3480, lr=0.185750, batch loss=0.194597, epoch loss=4.953933 Batch=1139, step=3540, lr=0.185500, batch loss=0.245148, epoch loss=5.199082 Batch=1199, step=3600, lr=0.185250, batch loss=0.199342, epoch loss=5.398424 Epoch=2, step=3600, lr=0.185250, epoch loss=5.398424 Batch=59, step=3660, lr=0.185000, batch loss=0.225663, epoch loss=0.225663 Batch=119, step=3720, lr=0.184750, batch loss=0.189736, epoch loss=0.415399 Batch=179, step=3780, lr=0.184500, batch loss=0.211727, epoch loss=0.627126 Batch=239, step=3840, lr=0.184250, batch loss=0.315662, epoch loss=0.942788 Batch=299, step=3900, lr=0.184000, batch loss=0.207523, epoch loss=1.150311 Batch=359, step=3960, lr=0.183750, batch loss=0.285845, epoch loss=1.436156 Batch=419, step=4020, lr=0.183500, batch 
loss=0.279733, epoch loss=1.715889 Batch=479, step=4080, lr=0.183250, batch loss=0.254340, epoch loss=1.970229 Batch=539, step=4140, lr=0.183000, batch loss=0.202477, epoch loss=2.172706 Batch=599, step=4200, lr=0.182750, batch loss=0.252724, epoch loss=2.425430 Batch=659, step=4260, lr=0.182500, batch loss=0.333569, epoch loss=2.758999 Batch=719, step=4320, lr=0.182250, batch loss=0.337620, epoch loss=3.096619 Batch=779, step=4380, lr=0.182000, batch loss=0.353794, epoch loss=3.450413 Batch=839, step=4440, lr=0.181750, batch loss=0.315897, epoch loss=3.766310 Batch=899, step=4500, lr=0.181500, batch loss=0.287555, epoch loss=4.053865 Batch=959, step=4560, lr=0.181000, batch loss=0.246950, epoch loss=4.300816 Batch=1019, step=4620, lr=0.181000, batch loss=0.350206, epoch loss=4.651022 Batch=1079, step=4680, lr=0.180750, batch loss=0.209148, epoch loss=4.860169 Batch=1139, step=4740, lr=0.180500, batch loss=0.246341, epoch loss=5.106510 Batch=1199, step=4800, lr=0.180000, batch loss=0.192534, epoch loss=5.299044 Epoch=3, step=4800, lr=0.180000, epoch loss=5.299044 Batch=59, step=4860, lr=0.180000, batch loss=0.229421, epoch loss=0.229421 Batch=119, step=4920, lr=0.179750, batch loss=0.188581, epoch loss=0.418002 Batch=179, step=4980, lr=0.179500, batch loss=0.206519, epoch loss=0.624521 Batch=239, step=5040, lr=0.179250, batch loss=0.307692, epoch loss=0.932213 Batch=299, step=5100, lr=0.179000, batch loss=0.201587, epoch loss=1.133800 Batch=359, step=5160, lr=0.178750, batch loss=0.274097, epoch loss=1.407897 Batch=419, step=5220, lr=0.178500, batch loss=0.265135, epoch loss=1.673032 Batch=479, step=5280, lr=0.178250, batch loss=0.241397, epoch loss=1.914429 Batch=539, step=5340, lr=0.177750, batch loss=0.191803, epoch loss=2.106231 Batch=599, step=5400, lr=0.177750, batch loss=0.229846, epoch loss=2.336077 Batch=659, step=5460, lr=0.177500, batch loss=0.325502, epoch loss=2.661579 Batch=719, step=5520, lr=0.177250, batch loss=0.332541, epoch loss=2.994120 Batch=779, step=5580, lr=0.177000, batch loss=0.342864, epoch loss=3.336983 Batch=839, step=5640, lr=0.176750, batch loss=0.309580, epoch loss=3.646564 Batch=899, step=5700, lr=0.176500, batch loss=0.273492, epoch loss=3.920055 Batch=959, step=5760, lr=0.176250, batch loss=0.214318, epoch loss=4.134374 Batch=1019, step=5820, lr=0.176000, batch loss=0.335407, epoch loss=4.469781 Batch=1079, step=5880, lr=0.175750, batch loss=0.190125, epoch loss=4.659906 Batch=1139, step=5940, lr=0.175500, batch loss=0.221681, epoch loss=4.881587 Batch=1199, step=6000, lr=0.175250, batch loss=0.186861, epoch loss=5.068448 Epoch=4, step=6000, lr=0.175250, epoch loss=5.068448 Batch=59, step=6060, lr=0.175000, batch loss=0.223890, epoch loss=0.223890 Batch=119, step=6120, lr=0.174750, batch loss=0.186354, epoch loss=0.410244 Batch=179, step=6180, lr=0.174500, batch loss=0.200866, epoch loss=0.611110 Batch=239, step=6240, lr=0.174250, batch loss=0.299605, epoch loss=0.910715 Batch=299, step=6300, lr=0.174000, batch loss=0.206560, epoch loss=1.117274 Batch=359, step=6360, lr=0.173750, batch loss=0.267280, epoch loss=1.384554 Batch=419, step=6420, lr=0.173500, batch loss=0.264405, epoch loss=1.648959 Batch=479, step=6480, lr=0.173250, batch loss=0.245216, epoch loss=1.894175 Batch=539, step=6540, lr=0.173000, batch loss=0.192009, epoch loss=2.086184 Batch=599, step=6600, lr=0.172750, batch loss=0.230662, epoch loss=2.316846 Batch=659, step=6660, lr=0.172500, batch loss=0.314277, epoch loss=2.631123 Batch=719, step=6720, lr=0.172250, batch loss=0.316669, epoch 
loss=2.947792 Batch=779, step=6780, lr=0.172000, batch loss=0.330501, epoch loss=3.278293 Batch=839, step=6840, lr=0.171750, batch loss=0.305898, epoch loss=3.584191 Batch=899, step=6900, lr=0.171250, batch loss=0.265058, epoch loss=3.849249 Batch=959, step=6960, lr=0.171250, batch loss=0.218639, epoch loss=4.067888 Batch=1019, step=7020, lr=0.171000, batch loss=0.322717, epoch loss=4.390604 Batch=1079, step=7080, lr=0.170750, batch loss=0.177781, epoch loss=4.568385 Batch=1139, step=7140, lr=0.170500, batch loss=0.213832, epoch loss=4.782216 Batch=1199, step=7200, lr=0.170250, batch loss=0.183146, epoch loss=4.965362 Epoch=5, step=7200, lr=0.170250, epoch loss=4.965362 Batch=59, step=7260, lr=0.170000, batch loss=0.240895, epoch loss=0.240895 Batch=119, step=7320, lr=0.169750, batch loss=0.179856, epoch loss=0.420751 Batch=179, step=7380, lr=0.169500, batch loss=0.194852, epoch loss=0.615603 Batch=239, step=7440, lr=0.169250, batch loss=0.291700, epoch loss=0.907303 Batch=299, step=7500, lr=0.169000, batch loss=0.204262, epoch loss=1.111564 Batch=359, step=7560, lr=0.168750, batch loss=0.256577, epoch loss=1.368141 Batch=419, step=7620, lr=0.168500, batch loss=0.258758, epoch loss=1.626899 Batch=479, step=7680, lr=0.168250, batch loss=0.235264, epoch loss=1.862163 Batch=539, step=7740, lr=0.168000, batch loss=0.190369, epoch loss=2.052532 Batch=599, step=7800, lr=0.167750, batch loss=0.229295, epoch loss=2.281827 Batch=659, step=7860, lr=0.167500, batch loss=0.304574, epoch loss=2.586402 Batch=719, step=7920, lr=0.167250, batch loss=0.309238, epoch loss=2.895640 Batch=779, step=7980, lr=0.167000, batch loss=0.328485, epoch loss=3.224124 Batch=839, step=8040, lr=0.166750, batch loss=0.291794, epoch loss=3.515918 Batch=899, step=8100, lr=0.166250, batch loss=0.262882, epoch loss=3.778800 Batch=959, step=8160, lr=0.166250, batch loss=0.197752, epoch loss=3.976552 Batch=1019, step=8220, lr=0.166000, batch loss=0.324856, epoch loss=4.301408 Batch=1079, step=8280, lr=0.165750, batch loss=0.186296, epoch loss=4.487703 Batch=1139, step=8340, lr=0.165500, batch loss=0.220688, epoch loss=4.708391 Batch=1199, step=8400, lr=0.165250, batch loss=0.174174, epoch loss=4.882565 Epoch=6, step=8400, lr=0.165250, epoch loss=4.882565 Batch=59, step=8460, lr=0.165000, batch loss=0.211964, epoch loss=0.211964 Batch=119, step=8520, lr=0.164750, batch loss=0.172639, epoch loss=0.384603 Batch=179, step=8580, lr=0.164500, batch loss=0.188950, epoch loss=0.573553 Batch=239, step=8640, lr=0.164250, batch loss=0.280417, epoch loss=0.853970 Batch=299, step=8700, lr=0.164000, batch loss=0.190841, epoch loss=1.044811 Batch=359, step=8760, lr=0.163500, batch loss=0.250205, epoch loss=1.295016 Batch=419, step=8820, lr=0.163500, batch loss=0.245780, epoch loss=1.540796 Batch=479, step=8880, lr=0.163000, batch loss=0.229646, epoch loss=1.770442 Batch=539, step=8940, lr=0.163000, batch loss=0.178087, epoch loss=1.948528 Batch=599, step=9000, lr=0.162750, batch loss=0.218551, epoch loss=2.167079 Batch=659, step=9060, lr=0.162500, batch loss=0.295644, epoch loss=2.462723 Batch=719, step=9120, lr=0.162250, batch loss=0.296648, epoch loss=2.759372 Batch=779, step=9180, lr=0.162000, batch loss=0.316563, epoch loss=3.075934 Batch=839, step=9240, lr=0.161750, batch loss=0.288353, epoch loss=3.364287 Batch=899, step=9300, lr=0.161500, batch loss=0.251361, epoch loss=3.615648 Batch=959, step=9360, lr=0.161250, batch loss=0.190785, epoch loss=3.806434 Batch=1019, step=9420, lr=0.161000, batch loss=0.315569, epoch loss=4.122003 
Batch=1079, step=9480, lr=0.160750, batch loss=0.188096, epoch loss=4.310099 Batch=1139, step=9540, lr=0.160500, batch loss=0.212447, epoch loss=4.522546 Batch=1199, step=9600, lr=0.160250, batch loss=0.167514, epoch loss=4.690060 Epoch=7, step=9600, lr=0.160250, epoch loss=4.690060 Batch=59, step=9660, lr=0.160000, batch loss=0.203891, epoch loss=0.203891 Batch=119, step=9720, lr=0.159750, batch loss=0.169498, epoch loss=0.373388 Batch=179, step=9780, lr=0.159500, batch loss=0.180116, epoch loss=0.553504 Batch=239, step=9840, lr=0.159250, batch loss=0.262944, epoch loss=0.816448 Batch=299, step=9900, lr=0.159000, batch loss=0.186926, epoch loss=1.003374 Batch=359, step=9960, lr=0.158750, batch loss=0.240469, epoch loss=1.243843 Batch=419, step=10020, lr=0.158500, batch loss=0.233629, epoch loss=1.477472 Batch=479, step=10080, lr=0.158250, batch loss=0.215194, epoch loss=1.692666 Batch=539, step=10140, lr=0.158000, batch loss=0.170955, epoch loss=1.863621 Batch=599, step=10200, lr=0.157750, batch loss=0.202469, epoch loss=2.066090 Batch=659, step=10260, lr=0.157500, batch loss=0.282938, epoch loss=2.349028 Batch=719, step=10320, lr=0.157250, batch loss=0.280976, epoch loss=2.630004 Batch=779, step=10380, lr=0.157000, batch loss=0.300242, epoch loss=2.930246 Batch=839, step=10440, lr=0.156750, batch loss=0.273133, epoch loss=3.203379 Batch=899, step=10500, lr=0.156500, batch loss=0.239455, epoch loss=3.442834 Batch=959, step=10560, lr=0.156000, batch loss=0.200595, epoch loss=3.643429 Batch=1019, step=10620, lr=0.156000, batch loss=0.280488, epoch loss=3.923918 Batch=1079, step=10680, lr=0.155750, batch loss=0.149936, epoch loss=4.073853 Batch=1139, step=10740, lr=0.155500, batch loss=0.181981, epoch loss=4.255834 Batch=1199, step=10800, lr=0.155250, batch loss=0.165250, epoch loss=4.421084 Epoch=8, step=10800, lr=0.155250, epoch loss=4.421084 Batch=59, step=10860, lr=0.155000, batch loss=0.192189, epoch loss=0.192189 Batch=119, step=10920, lr=0.154750, batch loss=0.164391, epoch loss=0.356580 Batch=179, step=10980, lr=0.154250, batch loss=0.166924, epoch loss=0.523504 Batch=239, step=11040, lr=0.154250, batch loss=0.244303, epoch loss=0.767807 Batch=299, step=11100, lr=0.154000, batch loss=0.168343, epoch loss=0.936151 Batch=359, step=11160, lr=0.153500, batch loss=0.222939, epoch loss=1.159090 Batch=419, step=11220, lr=0.153500, batch loss=0.219053, epoch loss=1.378143 Batch=479, step=11280, lr=0.153250, batch loss=0.211593, epoch loss=1.589736 Batch=539, step=11340, lr=0.153000, batch loss=0.165722, epoch loss=1.755458 Batch=599, step=11400, lr=0.152750, batch loss=0.180797, epoch loss=1.936255 Batch=659, step=11460, lr=0.152500, batch loss=0.266331, epoch loss=2.202587 Batch=719, step=11520, lr=0.152250, batch loss=0.261616, epoch loss=2.464202 Batch=779, step=11580, lr=0.152000, batch loss=0.273635, epoch loss=2.737837 Batch=839, step=11640, lr=0.151750, batch loss=0.251185, epoch loss=2.989022 Batch=899, step=11700, lr=0.151500, batch loss=0.220253, epoch loss=3.209275 Batch=959, step=11760, lr=0.151000, batch loss=0.186876, epoch loss=3.396152 Batch=1019, step=11820, lr=0.151000, batch loss=0.277061, epoch loss=3.673213 Batch=1079, step=11880, lr=0.150750, batch loss=0.149643, epoch loss=3.822856 Batch=1139, step=11940, lr=0.150500, batch loss=0.180550, epoch loss=4.003406 Batch=1199, step=12000, lr=0.150250, batch loss=0.141206, epoch loss=4.144612 Epoch=9, step=12000, lr=0.150250, epoch loss=4.144612 Batch=59, step=12060, lr=0.150000, batch loss=0.163165, epoch loss=0.163165 
Batch=119, step=12120, lr=0.149750, batch loss=0.136980, epoch loss=0.300145 Batch=179, step=12180, lr=0.149500, batch loss=0.152020, epoch loss=0.452165 Batch=239, step=12240, lr=0.149250, batch loss=0.220984, epoch loss=0.673149 Batch=299, step=12300, lr=0.149000, batch loss=0.143844, epoch loss=0.816993 Batch=359, step=12360, lr=0.148750, batch loss=0.198397, epoch loss=1.015391 Batch=419, step=12420, lr=0.148500, batch loss=0.208012, epoch loss=1.223403 Batch=479, step=12480, lr=0.148250, batch loss=0.180196, epoch loss=1.403599 Batch=539, step=12540, lr=0.147750, batch loss=0.143262, epoch loss=1.546860 Batch=599, step=12600, lr=0.147750, batch loss=0.150087, epoch loss=1.696947 Batch=659, step=12660, lr=0.147250, batch loss=0.228936, epoch loss=1.925883 Batch=719, step=12720, lr=0.147250, batch loss=0.242172, epoch loss=2.168055 Batch=779, step=12780, lr=0.146750, batch loss=0.264758, epoch loss=2.432812 Batch=839, step=12840, lr=0.146750, batch loss=0.237891, epoch loss=2.670703 Batch=899, step=12900, lr=0.146500, batch loss=0.204984, epoch loss=2.875687 Batch=959, step=12960, lr=0.146250, batch loss=0.151384, epoch loss=3.027071 Batch=1019, step=13020, lr=0.146000, batch loss=0.271056, epoch loss=3.298127 Batch=1079, step=13080, lr=0.145500, batch loss=0.110411, epoch loss=3.408538 Batch=1139, step=13140, lr=0.145500, batch loss=0.152098, epoch loss=3.560635 Batch=1199, step=13200, lr=0.145250, batch loss=0.119214, epoch loss=3.679849 Epoch=10, step=13200, lr=0.145250, epoch loss=3.679849 Batch=59, step=13260, lr=0.145000, batch loss=0.139299, epoch loss=0.139299 Batch=119, step=13320, lr=0.144750, batch loss=0.118187, epoch loss=0.257486 Batch=179, step=13380, lr=0.144500, batch loss=0.129916, epoch loss=0.387401 Batch=239, step=13440, lr=0.144250, batch loss=0.186724, epoch loss=0.574125 Batch=299, step=13500, lr=0.144000, batch loss=0.121994, epoch loss=0.696119 Batch=359, step=13560, lr=0.143750, batch loss=0.164482, epoch loss=0.860601 Batch=419, step=13620, lr=0.143500, batch loss=0.163711, epoch loss=1.024312 Batch=479, step=13680, lr=0.143250, batch loss=0.147335, epoch loss=1.171647 Batch=539, step=13740, lr=0.142750, batch loss=0.121067, epoch loss=1.292714 Batch=599, step=13800, lr=0.142750, batch loss=0.121986, epoch loss=1.414701 Batch=659, step=13860, lr=0.142500, batch loss=0.179717, epoch loss=1.594418 Batch=719, step=13920, lr=0.142250, batch loss=0.179391, epoch loss=1.773809 Batch=779, step=13980, lr=0.142000, batch loss=0.200517, epoch loss=1.974325 Batch=839, step=14040, lr=0.141750, batch loss=0.186342, epoch loss=2.160667 Batch=899, step=14100, lr=0.141500, batch loss=0.163567, epoch loss=2.324235 Batch=959, step=14160, lr=0.141250, batch loss=0.158001, epoch loss=2.482236 Batch=1019, step=14220, lr=0.141000, batch loss=0.377961, epoch loss=2.860196 Batch=1079, step=14280, lr=0.140750, batch loss=0.075377, epoch loss=2.935573 Batch=1139, step=14340, lr=0.140500, batch loss=0.128468, epoch loss=3.064042 Batch=1199, step=14400, lr=0.140250, batch loss=0.097015, epoch loss=3.161057 Epoch=11, step=14400, lr=0.140250, epoch loss=3.161057 Batch=59, step=14460, lr=0.140000, batch loss=0.122449, epoch loss=0.122449 Batch=119, step=14520, lr=0.139500, batch loss=0.107695, epoch loss=0.230144 Batch=179, step=14580, lr=0.139500, batch loss=0.109235, epoch loss=0.339379 Batch=239, step=14640, lr=0.139250, batch loss=0.146525, epoch loss=0.485903 Batch=299, step=14700, lr=0.138750, batch loss=0.082038, epoch loss=0.567941 Batch=359, step=14760, lr=0.138750, batch 
loss=0.122723, epoch loss=0.690664 Batch=419, step=14820, lr=0.138500, batch loss=0.130626, epoch loss=0.821290 Batch=479, step=14880, lr=0.138250, batch loss=0.102344, epoch loss=0.923633 Batch=539, step=14940, lr=0.138000, batch loss=0.089866, epoch loss=1.013500 Batch=599, step=15000, lr=0.137750, batch loss=0.088555, epoch loss=1.102055 Batch=659, step=15060, lr=0.137500, batch loss=0.136974, epoch loss=1.239029 Batch=719, step=15120, lr=0.137250, batch loss=0.163757, epoch loss=1.402786 Batch=779, step=15180, lr=0.137000, batch loss=0.255077, epoch loss=1.657862 Batch=839, step=15240, lr=0.136750, batch loss=0.134692, epoch loss=1.792554 Batch=899, step=15300, lr=0.136500, batch loss=0.137161, epoch loss=1.929715 Batch=959, step=15360, lr=0.136250, batch loss=0.089365, epoch loss=2.019081 Batch=1019, step=15420, lr=0.136000, batch loss=0.154455, epoch loss=2.173536 Batch=1079, step=15480, lr=0.135750, batch loss=0.053731, epoch loss=2.227267 Batch=1139, step=15540, lr=0.135500, batch loss=0.102882, epoch loss=2.330149 Batch=1199, step=15600, lr=0.135250, batch loss=0.060820, epoch loss=2.390970 Epoch=12, step=15600, lr=0.135250, epoch loss=2.390970 Batch=59, step=15660, lr=0.135000, batch loss=0.092768, epoch loss=0.092768 Batch=119, step=15720, lr=0.134750, batch loss=0.146813, epoch loss=0.239581 Batch=179, step=15780, lr=0.134500, batch loss=0.104695, epoch loss=0.344275 Batch=239, step=15840, lr=0.134250, batch loss=0.103984, epoch loss=0.448259 Batch=299, step=15900, lr=0.134000, batch loss=0.044223, epoch loss=0.492483 Batch=359, step=15960, lr=0.133750, batch loss=0.087088, epoch loss=0.579571 Batch=419, step=16020, lr=0.133500, batch loss=0.081577, epoch loss=0.661148 Batch=479, step=16080, lr=0.133250, batch loss=0.058362, epoch loss=0.719510 Batch=539, step=16140, lr=0.133000, batch loss=0.067994, epoch loss=0.787505 Batch=599, step=16200, lr=0.132750, batch loss=0.147955, epoch loss=0.935459 Batch=659, step=16260, lr=0.132500, batch loss=0.089447, epoch loss=1.024906 Batch=719, step=16320, lr=0.132250, batch loss=0.142144, epoch loss=1.167051 Batch=779, step=16380, lr=0.131750, batch loss=0.315118, epoch loss=1.482169 Batch=839, step=16440, lr=0.131750, batch loss=0.101974, epoch loss=1.584142 Batch=899, step=16500, lr=0.131500, batch loss=0.090821, epoch loss=1.674963 Batch=959, step=16560, lr=0.131250, batch loss=0.035541, epoch loss=1.710504 Batch=1019, step=16620, lr=0.130750, batch loss=0.064329, epoch loss=1.774832 Batch=1079, step=16680, lr=0.130750, batch loss=0.054160, epoch loss=1.828993 Batch=1139, step=16740, lr=0.130500, batch loss=0.091044, epoch loss=1.920037 Batch=1199, step=16800, lr=0.130250, batch loss=0.043978, epoch loss=1.964015 Epoch=13, step=16800, lr=0.130250, epoch loss=1.964015 Batch=59, step=16860, lr=0.129750, batch loss=0.037554, epoch loss=0.037554 Batch=119, step=16920, lr=0.129750, batch loss=0.038395, epoch loss=0.075949 Batch=179, step=16980, lr=0.129500, batch loss=0.045965, epoch loss=0.121913 Batch=239, step=17040, lr=0.129250, batch loss=0.062501, epoch loss=0.184415 Batch=299, step=17100, lr=0.129000, batch loss=0.020709, epoch loss=0.205123 Batch=359, step=17160, lr=0.128500, batch loss=0.045244, epoch loss=0.250367 Batch=419, step=17220, lr=0.128500, batch loss=0.067245, epoch loss=0.317612 Batch=479, step=17280, lr=0.128250, batch loss=0.025020, epoch loss=0.342632 Batch=539, step=17340, lr=0.128000, batch loss=0.029017, epoch loss=0.371649 Batch=599, step=17400, lr=0.127750, batch loss=0.037884, epoch loss=0.409533 Batch=659, 
step=17460, lr=0.127500, batch loss=0.054848, epoch loss=0.464381 Batch=719, step=17520, lr=0.127000, batch loss=0.071053, epoch loss=0.535434 Batch=779, step=17580, lr=0.127000, batch loss=0.069166, epoch loss=0.604600 Batch=839, step=17640, lr=0.126750, batch loss=0.101782, epoch loss=0.706382 Batch=899, step=17700, lr=0.126500, batch loss=0.052037, epoch loss=0.758419 Batch=959, step=17760, lr=0.126250, batch loss=0.018801, epoch loss=0.777220 Batch=1019, step=17820, lr=0.126000, batch loss=0.028359, epoch loss=0.805578 Batch=1079, step=17880, lr=0.125750, batch loss=0.023730, epoch loss=0.829308 Batch=1139, step=17940, lr=0.125500, batch loss=0.054930, epoch loss=0.884239 Batch=1199, step=18000, lr=0.125000, batch loss=0.021646, epoch loss=0.905885 Epoch=14, step=18000, lr=0.125000, epoch loss=0.905885 Batch=59, step=18060, lr=0.125000, batch loss=0.013636, epoch loss=0.013636 Batch=119, step=18120, lr=0.124500, batch loss=0.020693, epoch loss=0.034329 Batch=179, step=18180, lr=0.124500, batch loss=0.030312, epoch loss=0.064642 Batch=239, step=18240, lr=0.124250, batch loss=0.035522, epoch loss=0.100163 Batch=299, step=18300, lr=0.124000, batch loss=0.010290, epoch loss=0.110453 Batch=359, step=18360, lr=0.123750, batch loss=0.025752, epoch loss=0.136206 Batch=419, step=18420, lr=0.123500, batch loss=0.041881, epoch loss=0.178086 Batch=479, step=18480, lr=0.123250, batch loss=0.016504, epoch loss=0.194590 Batch=539, step=18540, lr=0.123000, batch loss=0.022845, epoch loss=0.217435 Batch=599, step=18600, lr=0.122750, batch loss=0.028299, epoch loss=0.245734 Batch=659, step=18660, lr=0.122500, batch loss=0.035059, epoch loss=0.280793 Batch=719, step=18720, lr=0.122250, batch loss=0.065917, epoch loss=0.346710 Batch=779, step=18780, lr=0.121750, batch loss=0.044995, epoch loss=0.391705 Batch=839, step=18840, lr=0.121500, batch loss=0.078914, epoch loss=0.470620 Batch=899, step=18900, lr=0.121500, batch loss=0.036034, epoch loss=0.506654 Batch=959, step=18960, lr=0.121000, batch loss=0.013680, epoch loss=0.520334 Batch=1019, step=19020, lr=0.121000, batch loss=0.025350, epoch loss=0.545684 Batch=1079, step=19080, lr=0.120750, batch loss=0.014747, epoch loss=0.560430 Batch=1139, step=19140, lr=0.120500, batch loss=0.028784, epoch loss=0.589215 Batch=1199, step=19200, lr=0.120250, batch loss=0.012095, epoch loss=0.601310 Epoch=15, step=19200, lr=0.120250, epoch loss=0.601310 Batch=59, step=19260, lr=0.120000, batch loss=0.011548, epoch loss=0.011548 Batch=119, step=19320, lr=0.119750, batch loss=0.026432, epoch loss=0.037980 Batch=179, step=19380, lr=0.119500, batch loss=0.073533, epoch loss=0.111513 Batch=239, step=19440, lr=0.119250, batch loss=0.030395, epoch loss=0.141908 Batch=299, step=19500, lr=0.119000, batch loss=0.016819, epoch loss=0.158727 Batch=359, step=19560, lr=0.118750, batch loss=0.041902, epoch loss=0.200629 Batch=419, step=19620, lr=0.118500, batch loss=0.022443, epoch loss=0.223072 Batch=479, step=19680, lr=0.118250, batch loss=0.006422, epoch loss=0.229494 Batch=539, step=19740, lr=0.118000, batch loss=0.024124, epoch loss=0.253617 Batch=599, step=19800, lr=0.117750, batch loss=0.030988, epoch loss=0.284606 Batch=659, step=19860, lr=0.117500, batch loss=0.019993, epoch loss=0.304599 Batch=719, step=19920, lr=0.117000, batch loss=0.038536, epoch loss=0.343135 Batch=779, step=19980, lr=0.116750, batch loss=0.081045, epoch loss=0.424180 Batch=839, step=20040, lr=0.116750, batch loss=0.032475, epoch loss=0.456654 Batch=899, step=20100, lr=0.116500, batch loss=0.029631, 
epoch loss=0.486285 Batch=959, step=20160, lr=0.116250, batch loss=0.013387, epoch loss=0.499672 Batch=1019, step=20220, lr=0.116000, batch loss=0.017196, epoch loss=0.516868 Batch=1079, step=20280, lr=0.115750, batch loss=0.002740, epoch loss=0.519608 Batch=1139, step=20340, lr=0.115500, batch loss=0.015086, epoch loss=0.534694 Batch=1199, step=20400, lr=0.115250, batch loss=0.008370, epoch loss=0.543064 Epoch=16, step=20400, lr=0.115250, epoch loss=0.543064 Batch=59, step=20460, lr=0.115000, batch loss=0.004601, epoch loss=0.004601 Batch=119, step=20520, lr=0.114750, batch loss=0.012011, epoch loss=0.016612 Batch=179, step=20580, lr=0.114500, batch loss=0.026627, epoch loss=0.043239 Batch=239, step=20640, lr=0.114250, batch loss=0.015576, epoch loss=0.058815 Batch=299, step=20700, lr=0.114000, batch loss=0.003473, epoch loss=0.062288 Batch=359, step=20760, lr=0.113750, batch loss=0.013567, epoch loss=0.075856 Batch=419, step=20820, lr=0.113500, batch loss=0.016536, epoch loss=0.092392 Batch=479, step=20880, lr=0.113250, batch loss=0.006338, epoch loss=0.098729 Batch=539, step=20940, lr=0.113000, batch loss=0.015658, epoch loss=0.114388 Batch=599, step=21000, lr=0.112500, batch loss=0.020334, epoch loss=0.134722 Batch=659, step=21060, lr=0.112500, batch loss=0.016172, epoch loss=0.150894 Batch=719, step=21120, lr=0.112000, batch loss=0.031169, epoch loss=0.182064 Batch=779, step=21180, lr=0.111750, batch loss=0.047835, epoch loss=0.229899 Batch=839, step=21240, lr=0.111750, batch loss=0.027606, epoch loss=0.257504 Batch=899, step=21300, lr=0.111500, batch loss=0.027467, epoch loss=0.284972 Batch=959, step=21360, lr=0.111250, batch loss=0.015807, epoch loss=0.300778 Batch=1019, step=21420, lr=0.111000, batch loss=0.014146, epoch loss=0.314924 Batch=1079, step=21480, lr=0.110750, batch loss=0.002551, epoch loss=0.317475 Batch=1139, step=21540, lr=0.110500, batch loss=0.012746, epoch loss=0.330221 Batch=1199, step=21600, lr=0.110250, batch loss=0.005305, epoch loss=0.335526 Epoch=17, step=21600, lr=0.110250, epoch loss=0.335526 Batch=59, step=21660, lr=0.110000, batch loss=0.003115, epoch loss=0.003115 Batch=119, step=21720, lr=0.109500, batch loss=0.007218, epoch loss=0.010333 Batch=179, step=21780, lr=0.109500, batch loss=0.012861, epoch loss=0.023194 Batch=239, step=21840, lr=0.109000, batch loss=0.011471, epoch loss=0.034665 Batch=299, step=21900, lr=0.108750, batch loss=0.013780, epoch loss=0.048445 Batch=359, step=21960, lr=0.108750, batch loss=0.016486, epoch loss=0.064932 Batch=419, step=22020, lr=0.108500, batch loss=0.013336, epoch loss=0.078267 Batch=479, step=22080, lr=0.108250, batch loss=0.002950, epoch loss=0.081217 Batch=539, step=22140, lr=0.108000, batch loss=0.018186, epoch loss=0.099403 Batch=599, step=22200, lr=0.107750, batch loss=0.016373, epoch loss=0.115777 Batch=659, step=22260, lr=0.107500, batch loss=0.012220, epoch loss=0.127997 Batch=719, step=22320, lr=0.107250, batch loss=0.023808, epoch loss=0.151805 Batch=779, step=22380, lr=0.107000, batch loss=0.043195, epoch loss=0.195000 Batch=839, step=22440, lr=0.106750, batch loss=0.022297, epoch loss=0.217296 Batch=899, step=22500, lr=0.106250, batch loss=0.023005, epoch loss=0.240301 Batch=959, step=22560, lr=0.106250, batch loss=0.011337, epoch loss=0.251638 Batch=1019, step=22620, lr=0.106000, batch loss=0.009438, epoch loss=0.261076 Batch=1079, step=22680, lr=0.105750, batch loss=0.000088, epoch loss=0.261164 Batch=1139, step=22740, lr=0.105500, batch loss=0.010512, epoch loss=0.271676 Batch=1199, step=22800, 
lr=0.105250, batch loss=0.004319, epoch loss=0.275996 Epoch=18, step=22800, lr=0.105250, epoch loss=0.275996 Batch=59, step=22860, lr=0.104750, batch loss=0.001521, epoch loss=0.001521 Batch=119, step=22920, lr=0.104750, batch loss=0.006203, epoch loss=0.007723 Batch=179, step=22980, lr=0.104500, batch loss=0.010917, epoch loss=0.018640 Batch=239, step=23040, lr=0.104250, batch loss=0.011046, epoch loss=0.029686 Batch=299, step=23100, lr=0.104000, batch loss=0.009216, epoch loss=0.038903 Batch=359, step=23160, lr=0.103750, batch loss=0.012438, epoch loss=0.051340 Batch=419, step=23220, lr=0.103500, batch loss=0.011549, epoch loss=0.062890 Batch=479, step=23280, lr=0.103250, batch loss=0.004073, epoch loss=0.066963 Batch=539, step=23340, lr=0.103000, batch loss=0.014117, epoch loss=0.081080 Batch=599, step=23400, lr=0.102750, batch loss=0.014819, epoch loss=0.095899 Batch=659, step=23460, lr=0.102500, batch loss=0.014112, epoch loss=0.110011 Batch=719, step=23520, lr=0.102250, batch loss=0.018202, epoch loss=0.128214 Batch=779, step=23580, lr=0.102000, batch loss=0.018797, epoch loss=0.147010 Batch=839, step=23640, lr=0.101750, batch loss=0.028295, epoch loss=0.175305 Batch=899, step=23700, lr=0.101500, batch loss=0.021861, epoch loss=0.197166 Batch=959, step=23760, lr=0.101250, batch loss=0.009889, epoch loss=0.207055 Batch=1019, step=23820, lr=0.101000, batch loss=0.007378, epoch loss=0.214432 Batch=1079, step=23880, lr=0.100750, batch loss=0.001290, epoch loss=0.215722 Batch=1139, step=23940, lr=0.100500, batch loss=0.008679, epoch loss=0.224401 Batch=1199, step=24000, lr=0.100250, batch loss=0.004531, epoch loss=0.228933 Epoch=19, step=24000, lr=0.100250, epoch loss=0.228933
Half-moons scatterplot and decision boundary:
[ASCII plot: training points of the two half-moon classes (drawn as # and %) overlaid on the decision regions learned by the model (drawn as * and .)]
"/usr/bin/env" "bash" "-c" "opam exec -- dune build @install @check @runtest && rm -rf _build" failed with exit status 1
2025-05-22 20:16.09: Job failed: Failed: Build failed
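Note on the schedule visible in the log: the learning rate decreases in increments of 0.00025 roughly every 60 steps, which is consistent with an approximately linear decay from about 0.2 near step 0 down to the 0.100250 shown at step 24000, and the reported epoch loss is simply the running sum of the reported batch losses, resetting at every Epoch=N summary. The OCaml sketch below mirrors that bookkeeping for illustration only; the base rate, final rate, and total step count are assumptions inferred from the excerpt, and lr_of_step and train_epoch are hypothetical helpers, not OCANNL API.

(* Sketch of the bookkeeping suggested by the log: a linear learning-rate
   decay quantized to 0.00025, and an epoch loss that accumulates batch
   losses and resets at each epoch boundary. The values 0.2, 0.1 and
   24_000 are inferred from the excerpt, not read from the test itself. *)
let lr_of_step ~base_lr ~final_lr ~total_steps step =
  let raw =
    base_lr
    +. ((final_lr -. base_lr) *. float_of_int step /. float_of_int total_steps)
  in
  (* Quantize to the 0.00025 granularity visible in the log. *)
  Float.round (raw /. 0.00025) *. 0.00025

let train_epoch ~step ~losses =
  (* [losses] stands in for one epoch's batch losses; a real loop would run
     forward/backward passes instead of folding over precomputed numbers. *)
  List.fold_left
    (fun (step, epoch_loss) batch_loss ->
      let step = step + 1 in
      let lr = lr_of_step ~base_lr:0.2 ~final_lr:0.1 ~total_steps:24_000 step in
      let epoch_loss = epoch_loss +. batch_loss in
      Printf.printf "step=%d, lr=%f, batch loss=%f, epoch loss=%f\n" step lr
        batch_loss epoch_loss;
      (step, epoch_loss))
    (step, 0.0) losses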
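Note on the scatterplot: the test trains a classifier on the classic two half-moons dataset; the plot shows the training points of the two classes (apparently drawn as # and %) over the decision regions learned by the model (drawn as * and .). For reference, a generic half-moons generator under the usual construction (two interleaved semicircles plus jitter) looks roughly like the sketch below; it is illustrative only and is not OCANNL's own data-generation code.

(* Generic half-moons generator: two interleaving semicircular point
   clouds with a little uniform jitter, labelled +1.0 and -1.0. *)
let half_moons ?(noise = 0.1) n =
  let pi = 4.0 *. atan 1.0 in
  let jitter () = noise *. (Random.float 2.0 -. 1.0) in
  let half = max 1 (n / 2) in
  List.init n (fun i ->
      let t = pi *. float_of_int (i mod half) /. float_of_int half in
      if i < half then
        (* Upper moon. *)
        (cos t +. jitter (), sin t +. jitter (), 1.0)
      else
        (* Lower moon, shifted and flipped so the two arcs interleave. *)
        (1.0 -. cos t +. jitter (), 0.5 -. sin t +. jitter (), -1.0))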