osc.models.attentions

Self, cross, co, and slot attention modules.

Attention modules implement the query-key-value projections, the attention operation itself, and the output projection.
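A minimal sketch of that pattern, assuming a PyTorch-style interface; MiniAttention and its parameters are illustrative names, not the actual signatures in osc.models.attentions:

from typing import Optional

import torch
from torch import nn


class MiniAttention(nn.Module):
    """Query-key-value projections, scaled dot-product attention, output projection."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.scale = (dim // num_heads) ** -0.5
        self.to_q = nn.Linear(dim, dim)       # query projection
        self.to_kv = nn.Linear(dim, 2 * dim)  # key and value projections
        self.proj = nn.Linear(dim, dim)       # output projection

    def forward(self, x: torch.Tensor, context: Optional[torch.Tensor] = None) -> torch.Tensor:
        # Self attention when context is None, cross attention otherwise.
        context = x if context is None else context
        B, N, D = x.shape
        M, H = context.shape[1], self.num_heads
        q = self.to_q(x).view(B, N, H, D // H).transpose(1, 2)         # (B, H, N, d)
        k, v = self.to_kv(context).chunk(2, dim=-1)
        k = k.view(B, M, H, D // H).transpose(1, 2)                    # (B, H, M, d)
        v = v.view(B, M, H, D // H).transpose(1, 2)                    # (B, H, M, d)
        attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)  # (B, H, N, M)
        out = (attn @ v).transpose(1, 2).reshape(B, N, D)
        return self.proj(out)

With context=None this behaves as self attention; passing a separate context tensor gives cross attention.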

Block modules wrap an attention module with layer norm, feed-forward layers, and residual connections.
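A matching sketch of the block pattern, reusing MiniAttention from above. The pre-norm ordering (layer norm before attention and before the feed-forward layers, residual around each) is an assumption; the real blocks may arrange these differently:

class MiniBlock(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4, ff_mult: int = 4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = MiniAttention(dim, num_heads)
        self.norm2 = nn.LayerNorm(dim)
        self.ff = nn.Sequential(
            nn.Linear(dim, ff_mult * dim),
            nn.GELU(),
            nn.Linear(ff_mult * dim, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.attn(self.norm1(x))  # attention + residual
        x = x + self.ff(self.norm2(x))    # feed-forward + residual
        return x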

Classes

CoAttentionBlock

Co-attention block, attending in both directions, a->b and b->a; see the sketch at the end of this section.

CrossAttention

Cross attention.

CrossAttentionBlock

Cross attention block.

CrossAttentionDecoder

Cross attention decoder.

SelfAttention

Self attention.

SelfAttentionBlock

Self attention block.

SlotAttention

Slot attention.
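SlotAttention presumably follows the algorithm of Locatello et al. (2020), where slots compete for input features through a softmax over the slot axis and are refined iteratively with a GRU. A minimal single-head sketch under that assumption (the real module may add an MLP residual and learned slot initialization):

class MiniSlotAttention(nn.Module):
    def __init__(self, dim: int, num_iters: int = 3, eps: float = 1e-8):
        super().__init__()
        self.num_iters = num_iters
        self.eps = eps
        self.scale = dim ** -0.5
        self.norm_inputs = nn.LayerNorm(dim)
        self.norm_slots = nn.LayerNorm(dim)
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.gru = nn.GRUCell(dim, dim)

    def forward(self, inputs: torch.Tensor, slots: torch.Tensor) -> torch.Tensor:
        # inputs: (B, N, D) features; slots: (B, S, D) initial slots.
        inputs = self.norm_inputs(inputs)
        k, v = self.to_k(inputs), self.to_v(inputs)
        B, S, D = slots.shape
        for _ in range(self.num_iters):
            q = self.to_q(self.norm_slots(slots))          # (B, S, D)
            logits = q @ k.transpose(-2, -1) * self.scale  # (B, S, N)
            attn = logits.softmax(dim=1) + self.eps        # softmax over slots, not inputs
            attn = attn / attn.sum(dim=-1, keepdim=True)   # weighted mean over inputs
            updates = attn @ v                             # (B, S, D)
            slots = self.gru(updates.reshape(B * S, D),
                             slots.reshape(B * S, D)).view(B, S, D)
        return slots

The distinguishing detail is the softmax over the slot axis (dim=1): input features are distributed among competing slots, rather than each query averaging over all inputs as in the modules above. Slot initialization, typically samples from a learned Gaussian, is left to the caller in this sketch.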
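For CoAttentionBlock, a hedged sketch of the two-directional idea, again reusing MiniAttention: each stream queries the other, and both are updated through residual connections. The symmetric layout is illustrative, not the block's actual implementation:

class MiniCoAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.norm_a = nn.LayerNorm(dim)
        self.norm_b = nn.LayerNorm(dim)
        self.attn_ab = MiniAttention(dim, num_heads)  # a->b direction
        self.attn_ba = MiniAttention(dim, num_heads)  # b->a direction

    def forward(self, a: torch.Tensor, b: torch.Tensor):
        a_n, b_n = self.norm_a(a), self.norm_b(b)
        a = a + self.attn_ab(a_n, context=b_n)  # a attends to b
        b = b + self.attn_ba(b_n, context=a_n)  # b attends to a
        return a, b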