
Conversation

@lockwo commented Feb 10, 2025

A bit later than I meant to, but this addresses #1059. It is a pretty straightforward change to Equinox-style NNs. There are some minor speed differences that could be optimized (see patrick-kidger/equinox#928, patrick-kidger/equinox#926), but IMO it's fast enough. This PR also depends on the following Equinox PR, which updates its batch norm code to be more consistent with other JAX libraries: patrick-kidger/equinox#948.

A full report comparing Haiku and Equinox should be available here (note: the Haiku results were just run on main; for MinAtar I used 1xA100 and for AlphaZero I used 4xA100): https://api.wandb.ai/links/rl-exps/fak1nv1n
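
For reference, here is a minimal sketch of the Equinox-style pattern the examples move to: an `eqx.Module` containing an `eqx.nn.BatchNorm`, with the batch norm state threaded explicitly and the batch axis named via `jax.vmap`. This is not the PR's actual network, just an illustration of the stock Equinox API; the specific batch norm mode added in patrick-kidger/equinox#948 is not shown.

```python
# Minimal sketch (not the PR's network) of an Equinox module with BatchNorm.
import jax
import jax.numpy as jnp
import jax.random as jr
import equinox as eqx


class SmallNet(eqx.Module):
    linear: eqx.nn.Linear
    norm: eqx.nn.BatchNorm

    def __init__(self, key):
        self.linear = eqx.nn.Linear(4, 8, key=key)
        # axis_name must match the axis name given to jax.vmap below.
        self.norm = eqx.nn.BatchNorm(input_size=8, axis_name="batch")

    def __call__(self, x, state):
        x = self.linear(x)
        # BatchNorm is stateful: it returns the output and the updated state.
        x, state = self.norm(x, state)
        return jax.nn.relu(x), state


key = jr.PRNGKey(0)
model, state = eqx.nn.make_with_state(SmallNet)(key)
xs = jnp.ones((16, 4))
# BatchNorm reduces over a named batch axis, so vmap with that axis name;
# the updated state is identical across the batch, hence out_axes=None for it.
ys, state = jax.vmap(
    model, axis_name="batch", in_axes=(0, None), out_axes=(0, None)
)(xs, state)
```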

@lockwo changed the title from "Change example's away from Haiku" to "Change examples away from Haiku" on Feb 12, 2025
@sotetsuk (Owner) commented

Awesome! 🤩 Thank you for your PR! I'll check it this weekend 🙏

@hyu2000 commented Sep 1, 2025

Hey guys, I'm glad I found this, as I was struggling to migrate the pgx-alphazero code to Equinox. Is this going to be merged soon?

@lockwo the wandb link seems to have expired. I'm curious about your go-9x9 run. Do you happen to have any details somewhere else? Thanks

@lockwo (Author) commented Sep 2, 2025

Oops, I think I cleaned out that W&B account recently. I can see about re-running the experiments if needed. For the 9x9 Go run, there happens to be a screenshot of the results in the old Equinox PR (patrick-kidger/equinox#948; just look at the batch norm variant that worked, lol, since that's the one in use now).

Code-wise, it's exactly the same as in the PR (no special sauce beyond that).
