purpose of batch norm controller #65

@hzhz2020

Description


Dear Authors,

Thanks for providing this great repo!
You mentioned in your FlexMatch paper that a batch norm controller is introduced in the codebase to prevent performance crashes for some algorithms. You mentioned that Mean Teacher, Pi-Model, and MixMatch might be unstable if BatchNorm is updated for labeled and unlabeled data in turn. Does this have to do with multi-GPU training? (If I use a single GPU, will this instability persist?)

Alternatively, can I simply freeze the batch norm layers when forwarding the unlabeled batch?
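For context on what "freezing" would mean here, below is a minimal pure-Python sketch (not the repo's actual controller, and variance/affine parameters are omitted) illustrating the bookkeeping in question: batch norm keeps running statistics via an exponential moving average, and freezing it for the unlabeled forward pass means skipping that update so the unlabeled batch cannot shift the statistics. In PyTorch this would roughly correspond to putting the BN layers in `eval()` mode for that forward pass.

```python
# Toy illustration only -- shows the running-statistics update that
# "freezing" batch norm would skip during the unlabeled forward pass.
class ToyBatchNorm:
    def __init__(self, momentum=0.1):
        self.momentum = momentum
        self.running_mean = 0.0
        self.frozen = False  # when True, running stats are not updated

    def forward(self, batch):
        mean = sum(batch) / len(batch)
        if not self.frozen:
            # standard EMA update of the running mean
            self.running_mean = (
                (1 - self.momentum) * self.running_mean + self.momentum * mean
            )
        # normalize with the batch mean (variance omitted for brevity)
        return [x - mean for x in batch]

bn = ToyBatchNorm()
bn.forward([1.0, 2.0, 3.0])      # labeled batch: running stats updated
labeled_stat = bn.running_mean

bn.frozen = True                 # freeze before the unlabeled forward
bn.forward([10.0, 20.0, 30.0])   # unlabeled batch: stats untouched
assert bn.running_mean == labeled_stat
```

Whether this simple freeze is sufficient, or whether the controller's save-and-restore of BN buffers is needed for stability, is exactly what the question above is asking the authors.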

Hope to hear from you. Thanks!

