PatchMerge

Patch merging layer from the Swin Transformer.

PatchMerge(dim=96, expansion_factor=2.0)

Forward

(x: jaxtyping.Float[Tensor, '... n d_in']) -> jaxtyping.Float[Tensor, '... n/4 d_out']
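
A minimal usage sketch, assuming the component is importable as `from torchmix import PatchMerge` and that the output width `d_out` equals `dim * expansion_factor`; neither detail is stated on this page:

```python
import torch

from torchmix import PatchMerge  # assumed import path

# Construct as in the signature above: merge patches starting from dim=96 channels.
merge = PatchMerge(dim=96, expansion_factor=2.0)

# Input follows the Forward signature (..., n, d_in); in Swin the n tokens come
# from an H x W patch grid, so n should be divisible by 4 for the 2x2 merge.
x = torch.randn(1, 56 * 56, 96)

y = merge(x)
print(y.shape)  # expected (1, 56 * 56 // 4, 192), assuming d_out = dim * expansion_factor
```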