For anyone in the same boat, this will fix it:

```
import torch

# Load the cached checkpoint and rename the classifier keys.
# Note: use forward slashes here; a backslash path like '\checkpoints\xception-...'
# silently breaks inside a Python string because '\x' starts a hex escape.
state_dict = torch.load(f'{torch.hub.get_dir()}/checkpoints/xception-43020ad28.pth')
state_dict['last_linear.weight'] = state_dict.pop('fc.weight')
state_dict['last_linear.bias'] = state_dict.pop('fc.bias')
```

You can then save the patched state dict back under the same filename, so you only have to do this once:

```
torch.save(state_dict, f'{torch.hub.get_dir()}/checkpoints/xception-43020ad28.pth')
```
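To sanity-check that the rename worked, you can load the patched weights into the model. This is a minimal sketch, assuming the checkpoint targets the Xception implementation from Cadene's `pretrainedmodels` package (which names its classifier `last_linear`):

```
import pretrainedmodels  # assumption: the checkpoint is for this package's Xception
import torch

# Build the architecture without downloading weights, then load the patched state dict.
model = pretrainedmodels.xception(num_classes=1000, pretrained=None)
state_dict = torch.load(f'{torch.hub.get_dir()}/checkpoints/xception-43020ad28.pth')
model.load_state_dict(state_dict)  # should now raise no missing/unexpected key errors
model.eval()
```

If `load_state_dict` passes without complaints, the key names match, and subsequent runs with `pretrained='imagenet'` should pick up the patched cached file instead of re-downloading.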