Change examples away from Haiku #1300
Conversation
Awesome! 🤩 Thank you for your PR! I'll check it this weekend 🙏
Hey guys, I'm glad I found this, as I was struggling to migrate the pgx-alphazero code to Equinox. @lockwo the wandb link seems to have expired. I'm curious about your go-9x9 run. Do you happen to have any details somewhere else? Thanks
Oops, I think I cleaned out that WB account recently. I can see about re-running the experiments if needed. For the 9x9 go run, there happens to be a screenshot of the results in the old Equinox PR (patrick-kidger/equinox#948; look at the batch norm variant that worked, since that's the one in use now). Code-wise, it's exactly the same as in this PR (no special sauce beyond that).
A bit later than I meant to, but this addresses #1059. It's a pretty straightforward change to Equinox-style NNs. There are some minor speed differences that could be optimized (see patrick-kidger/equinox#928, patrick-kidger/equinox#926), but IMO it's fast enough. This PR also depends on a PR to Equinox that updates its batch norm code to be more in line with other JAX libraries: patrick-kidger/equinox#948.
A full report comparing Haiku and Equinox should be available here (note: the Haiku results were run on `main`; also, for MinAtar I used 1x A100 and for AZ I used 4x A100): https://api.wandb.ai/links/rl-exps/fak1nv1n
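For anyone else migrating Haiku models, here is a minimal sketch of the Equinox pattern involved (this is not the code from this PR; the network structure and layer sizes are made up for illustration). The key difference from Haiku is that `eqx.nn.BatchNorm` is stateful and reduces over a named `vmap` axis, so the mutable state has to be threaded through the call explicitly:

```python
# Minimal sketch, assuming a recent Equinox version with eqx.nn.make_with_state.
# SmallNet and its layer sizes are hypothetical, purely for illustration.
import jax
import jax.numpy as jnp
import jax.random as jr
import equinox as eqx


class SmallNet(eqx.Module):
    linear1: eqx.nn.Linear
    norm: eqx.nn.BatchNorm
    linear2: eqx.nn.Linear

    def __init__(self, key):
        k1, k2 = jr.split(key)
        self.linear1 = eqx.nn.Linear(4, 32, key=k1)
        # BatchNorm computes batch statistics across the named vmap axis.
        self.norm = eqx.nn.BatchNorm(32, axis_name="batch")
        self.linear2 = eqx.nn.Linear(32, 2, key=k2)

    def __call__(self, x, state):
        x = jax.nn.relu(self.linear1(x))
        # Stateful layers return both the output and the updated state.
        x, state = self.norm(x, state)
        return self.linear2(x), state


key = jr.PRNGKey(0)
# make_with_state splits the model into trainable parameters and mutable state.
model, state = eqx.nn.make_with_state(SmallNet)(key)
xs = jnp.ones((8, 4))
# vmap over the batch dimension; the state goes in (and comes out) unbatched.
ys, state = jax.vmap(
    model, in_axes=(0, None), out_axes=(0, None), axis_name="batch"
)(xs, state)
```

The main ergonomic change compared to Haiku's `hk.transform_with_state` is that the state is an explicit pytree you pass around yourself rather than something handled implicitly by a transform.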