
added comment explaining group norm behavior in ggml.h#813

Open
balisujohn wants to merge 1 commit into ggml-org:master from balisujohn:dev-groupnorm-comment

Conversation

@balisujohn
Contributor

Adds a comment explaining behavior described in this issue: #803

@slaren
Member

slaren commented May 6, 2024

@leejet since you wrote ggml_group_norm, can you review if this documentation is accurate?

@leejet
Contributor

leejet commented May 14, 2024

ggml_group_norm is not calculated over only the first two dimensions. Rather, it splits ne2 (equivalent to num_channels in PyTorch) into n_groups (equivalent to num_groups in PyTorch) groups, then normalizes over each group together with dimensions ne0 and ne1. Since most scenarios using group norm involve 3- or 4-dimensional tensors, I simply assumed the group partitioning occurs along ne2. I previously commented in the API `// group normalize along ne0*ne1*n_groups`, but in reality it should be `// group normalize along ne0*ne1*(ne2/n_groups)`.
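To make the described behavior concrete, here is a minimal NumPy sketch of that normalization (not ggml's actual C implementation): the channel dimension is split into n_groups groups, and mean/variance are computed over each group together with the spatial dimensions, i.e. over ne0 * ne1 * (ne2 / n_groups) elements. The eps value is an assumption for illustration.

```python
import numpy as np

def group_norm(x, n_groups, eps=1e-6):
    # x has PyTorch-style shape (N, C, H, W), i.e. ggml's (ne3, ne2, ne1, ne0).
    # Each group spans C // n_groups channels, so statistics are computed over
    # ne0 * ne1 * (ne2 / n_groups) elements per group, per batch entry.
    N, C, H, W = x.shape
    assert C % n_groups == 0
    g = x.reshape(N, n_groups, C // n_groups, H, W)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    g = (g - mean) / np.sqrt(var + eps)
    return g.reshape(N, C, H, W)
```

After normalization, each group of channels (not each individual channel) has approximately zero mean and unit variance, which matches the corrected comment `// group normalize along ne0*ne1*(ne2/n_groups)`.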

@leejet
Contributor

leejet commented May 14, 2024

Here is a simple schematic diagram: ne0 => W, ne1 => H, ne2 => C, ne3 => N.

[image: norm — schematic of group norm partitioning]
