osc.models.attentions
Self-, cross-, co-, and slot-attention modules.
Attention modules implement the query-key-value projections, the attention operation itself, and the output projection.
Block modules wrap an attention module with layer normalization, feed-forward layers, and residual connections.
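The division of labor between attention modules and block modules might look like the following minimal sketch, assuming PyTorch-style modules; the names SelfAttention and SelfAttentionBlock mirror the class summaries below, but all signatures and defaults are illustrative, not the actual osc.models.attentions API.

```python
import torch
from torch import nn


class SelfAttention(nn.Module):
    """Attention module: q-k-v projections, attention, output projection."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.num_heads = num_heads
        self.scale = (dim // num_heads) ** -0.5
        self.qkv = nn.Linear(dim, dim * 3)  # joint query-key-value projection
        self.proj = nn.Linear(dim, dim)     # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, D = x.shape
        H = self.num_heads
        # Split into three (B, H, N, D/H) tensors: queries, keys, values.
        q, k, v = self.qkv(x).reshape(B, N, 3, H, D // H).permute(2, 0, 3, 1, 4)
        attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, N, D)
        return self.proj(out)


class SelfAttentionBlock(nn.Module):
    """Block module: attention and feed-forward, each behind a layer norm
    and wrapped in a residual connection."""

    def __init__(self, dim: int, num_heads: int = 8, mlp_ratio: float = 4.0):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = SelfAttention(dim, num_heads)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, int(dim * mlp_ratio)),
            nn.GELU(),
            nn.Linear(int(dim * mlp_ratio), dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.attn(self.norm1(x))  # residual around attention
        x = x + self.mlp(self.norm2(x))   # residual around feed-forward
        return x
```

The sketch uses a pre-norm layout (normalizing before each sub-layer); the actual blocks may instead apply the norm after the residual addition.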
Classes

- Co-attention block, both a->b and b->a (see the co-attention sketch after this list).
- Cross attention (sketched after this list).
- Cross attention block.
- Self attention.
- Self attention block.
- Slot attention (see the slot-attention sketch after this list).
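For the cross- and co-attention classes, the pattern plausibly extends as below: cross attention draws its queries from one sequence and its keys and values from another, and a co-attention block applies cross attention symmetrically in both directions. Again a hedged sketch building on the SelfAttention example above; class names, signatures, and the choice of pre-norm are assumptions, not the actual osc classes.

```python
import torch
from torch import nn


class CrossAttention(nn.Module):
    """Queries come from x; keys and values come from a separate context."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.num_heads = num_heads
        self.scale = (dim // num_heads) ** -0.5
        self.to_q = nn.Linear(dim, dim)
        self.to_kv = nn.Linear(dim, dim * 2)  # joint key-value projection
        self.proj = nn.Linear(dim, dim)       # output projection

    def forward(self, x: torch.Tensor, ctx: torch.Tensor) -> torch.Tensor:
        B, N, D = x.shape
        M, H = ctx.shape[1], self.num_heads
        q = self.to_q(x).reshape(B, N, H, D // H).transpose(1, 2)
        k, v = self.to_kv(ctx).reshape(B, M, 2, H, D // H).permute(2, 0, 3, 1, 4)
        attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, N, D)
        return self.proj(out)


class CoAttentionBlock(nn.Module):
    """Update two sequences symmetrically: a attends to b, b attends to a."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.norm_a = nn.LayerNorm(dim)
        self.norm_b = nn.LayerNorm(dim)
        self.attn_ab = CrossAttention(dim, num_heads)  # a -> b
        self.attn_ba = CrossAttention(dim, num_heads)  # b -> a

    def forward(self, a: torch.Tensor, b: torch.Tensor):
        a_n, b_n = self.norm_a(a), self.norm_b(b)
        # Both updates read the pre-update sequences, then residual-add,
        # so the a->b and b->a directions are computed symmetrically.
        return a + self.attn_ab(a_n, b_n), b + self.attn_ba(b_n, a_n)
```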
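Slot attention, presumably in the style of Locatello et al. (2020), differs from the dot-product variants above: the softmax is taken over the slot axis, so slots compete to explain each input feature, and the slots are refined over several iterations through a GRU. A minimal sketch under that assumption follows; names and defaults are illustrative, and the reference formulation also adds a small residual MLP after the GRU, omitted here for brevity.

```python
import torch
from torch import nn


class SlotAttention(nn.Module):
    def __init__(self, dim: int, num_slots: int = 7, num_iters: int = 3):
        super().__init__()
        self.num_slots, self.num_iters = num_slots, num_iters
        self.scale = dim ** -0.5
        # Initial slots are sampled from a learned Gaussian.
        self.slots_mu = nn.Parameter(torch.randn(1, 1, dim))
        self.slots_logsigma = nn.Parameter(torch.zeros(1, 1, dim))
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.gru = nn.GRUCell(dim, dim)
        self.norm_inputs = nn.LayerNorm(dim)
        self.norm_slots = nn.LayerNorm(dim)

    def forward(self, inputs: torch.Tensor) -> torch.Tensor:
        B, N, D = inputs.shape
        inputs = self.norm_inputs(inputs)
        k, v = self.to_k(inputs), self.to_v(inputs)
        slots = self.slots_mu + self.slots_logsigma.exp() * torch.randn(
            B, self.num_slots, D, device=inputs.device
        )
        for _ in range(self.num_iters):
            q = self.to_q(self.norm_slots(slots))
            # Softmax over the slot axis: slots compete for each input.
            attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=1)
            # Renormalize over inputs to take a weighted mean per slot.
            attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-8)
            updates = attn @ v  # (B, num_slots, D)
            slots = self.gru(
                updates.reshape(B * self.num_slots, D),
                slots.reshape(B * self.num_slots, D),
            ).reshape(B, self.num_slots, D)
        return slots
```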