Xception pretrained state_dict keys no longer match the parameter names in the xception class  #227

@jiversivers

Description

For anyone in the same boat, this will fix it:

import torch

# Load the cached checkpoint and rename the classifier keys from the
# old 'fc.*' names to the 'last_linear.*' names the xception class expects.
state_dict = torch.load(f'{torch.hub.get_dir()}/checkpoints/xception-43020ad28.pth')
state_dict['last_linear.weight'] = state_dict.pop('fc.weight')
state_dict['last_linear.bias'] = state_dict.pop('fc.bias')

You can then save the remapped state_dict under the same file name so you only have to do this once:

torch.save(state_dict, f'{torch.hub.get_dir()}/checkpoints/xception-43020ad28.pth')
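
As a quick sanity check that the remapped keys line up, here is a minimal sketch of loading the weights into the model. The pretrainedmodels.xception constructor and its arguments are an assumption on my part; adjust to however you actually instantiate the network:

import torch
import pretrainedmodels  # assumption: the xception class comes from this package

# Assumption: pretrained=None builds the architecture without triggering
# the built-in weight download, so we can load our remapped checkpoint.
model = pretrainedmodels.xception(num_classes=1000, pretrained=None)

state_dict = torch.load(f'{torch.hub.get_dir()}/checkpoints/xception-43020ad28.pth')
model.load_state_dict(state_dict)  # should report no missing or unexpected keys
model.eval()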
