The current training guide targets the v0.9 Windows executable that ships with built-in ML-Agents.
A forthcoming v1.0 (July 1, 2025) will provide a stand-alone Python API (no ML-Agents) plus headless Linux builds for cluster training.

Follow the numbered sections below to install dependencies, launch training, add your own model, and locate results.

1 · Download Environment

Download the environment archive MouseVsAI_Windows_v0.9.zip.

2 · Install Miniconda

† Skip this section if you already have Anaconda or Miniconda.

a) Download the installer:

      curl -o Miniconda3-latest-Windows-x86_64.exe ^
           https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe
   

b) Silent install:

      start /wait "" Miniconda3-latest-Windows-x86_64.exe ^
           /InstallationType=JustMe /AddToPath=1 /RegisterPython=1 /S ^
           /D=%USERPROFILE%\Miniconda3
   

c) Activate:

   %USERPROFILE%\Miniconda3\Scripts\activate
   

Check installation:

   conda --version
   

3 · Create and activate the training environment

   cd <folder-with-exe-and-mouse.yml>
   conda env create -n mouse2 -f mouse.yml
   conda activate mouse2
   
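Before moving on, it can help to confirm the right interpreter is active. A minimal sanity check, assuming the env name mouse2 from the commands above:

```python
# Sanity check: after `conda activate mouse2`, the active interpreter's
# prefix should end in ...\envs\mouse2 (env name assumed from above).
import os
import sys

print(sys.executable)                 # full path to the env's python executable
print(os.path.basename(sys.prefix))   # expected: "mouse2" when the env is active
```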

One-time path fix: open train.py and replace the placeholder path to encoders.py with the actual path inside your environment, e.g. C:/Miniconda3/envs/mouse2/Lib/site-packages/mlagents/trainers/torch/encoders.py
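Rather than hunting for the file by hand, Python can report where a module is installed. The snippet below uses the stdlib module json as a stand-in so it runs anywhere; inside the mouse2 env, swap in "mlagents.trainers.torch.encoders" to get the exact path for train.py:

```python
# Print where a module's source file lives in the active environment.
# "json" is a stdlib stand-in for the demo; replace it with
# "mlagents.trainers.torch.encoders" once ML-Agents is installed.
import importlib.util

spec = importlib.util.find_spec("json")
print(spec.origin)  # absolute path to the module's source file
```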

4 · Run the training script

   python start.py
   

The script prints usage:

   Usage: python start.py [train|test] [options]
     --runs-per-network R
     --run-id ID
     --networks N1,N2,N3   (fully_connected, nature_cnn, simple, resnet)
   
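For orientation, the options above could be parsed roughly as follows. This is an argparse sketch of the CLI surface, not the actual start.py source; the defaults shown are assumptions:

```python
# Hedged sketch of start.py's command-line interface; the real script may differ.
import argparse

parser = argparse.ArgumentParser(prog="start.py")
parser.add_argument("mode", choices=["train", "test"])
parser.add_argument("--runs-per-network", type=int, default=1, metavar="R")
parser.add_argument("--run-id", default="run0", metavar="ID")
parser.add_argument("--networks", default="nature_cnn",
                    help="comma-separated: fully_connected, nature_cnn, simple, resnet")

args = parser.parse_args(["train", "--run-id", "Normal", "--networks", "simple,resnet"])
print(args.mode, args.run_id, args.networks.split(","))
```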

Example training command:

   python -u start.py train ^
          --runs-per-network 1 ^
          --run-id Normal ^
          --networks neurips,simple,fully_connected
   

5 · Customise the model

Add your custom encoder to the Encoders/ directory. Optionally tweak hyper‑parameters in nature.yml (keep vis_encode_type: nature_cnn). Re‑run the command from § 4.
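As a reference point, a trimmed nature.yml could look like the fragment below, following the ML-Agents trainer-config layout. The behavior name and hyperparameter values are illustrative assumptions; only vis_encode_type: nature_cnn comes from the note above:

```yaml
# Illustrative fragment only -- behavior name and values are assumptions.
behaviors:
  Mouse:                            # behavior name assumed; match your environment's
    trainer_type: ppo
    hyperparameters:
      batch_size: 1024
      learning_rate: 3.0e-4
    network_settings:
      vis_encode_type: nature_cnn   # keep this, per the note above
      hidden_units: 256
      num_layers: 2
    max_steps: 5.0e5
```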