lucidrains | 3cff5e547a | 2025-12-02 05:21:52 -08:00
address https://github.com/lucidrains/vit-pytorch/issues/352

roydenwa | 9d43e4d0bb | 2024-08-21 19:23:38 -07:00
Add ViViT variant with factorized self-attention (#327)
* Add FactorizedTransformer
* Add variant param and check in fwd method
* Check if variant is implemented
* Describe new ViViT variant

Phil Wang | 8208c859a5 | 2023-08-14 09:48:55 -07:00
just remove PreNorm wrapper from all ViTs, as it is unlikely to change at this point

Phil Wang | 3e5d1be6f0 | 2023-08-09 07:53:38 -07:00
address https://github.com/lucidrains/vit-pytorch/pull/274

Phil Wang | ce4bcd08fb | 2023-05-20 08:24:49 -07:00
address https://github.com/lucidrains/vit-pytorch/issues/266

Phil Wang | 5699ed7d13 | 2023-02-10 10:39:50 -08:00
double down on dual patch norm, fix MAE and SimMIM to be compatible with dual patchnorm

Phil Wang | 6ec8fdaa6d | 2022-10-24 19:59:48 -07:00
make sure global average pool can be used for ViViT in place of cls token

Phil Wang | 13fabf901e | 2022-10-24 09:34:04 -07:00
add ViViT