Commit Graph

18 Commits

Author | SHA1 | Message | Date
lucidrains | 077d8c188f | fix distill | 2025-12-10 15:52:10 -08:00
lucidrains | 5888f05300 | 1.16.4 | 2025-12-07 04:32:52 -08:00
lucidrains | dd6462d19b | release small NaViT perf improvement | 2025-12-06 04:57:12 -08:00
lucidrains | 3cff5e547a | address https://github.com/lucidrains/vit-pytorch/issues/352 | 2025-12-02 05:21:52 -08:00
lucidrains | fdaf7f92b9 | fix positional embed for the mean pool case and clean up | 2025-11-27 17:01:47 -08:00
lucidrains | 0ebd4edab9 | address https://github.com/lucidrains/vit-pytorch/issues/351 | 2025-11-27 06:07:43 -08:00
lucidrains | aa49c2783a | VAAT should have two ears | 2025-11-22 08:32:23 -08:00
lucidrains | 6aa0374313 | register tokens for the AST in VAAT (sketched below) | 2025-11-22 08:12:01 -08:00
lucidrains | b35a97de05 | improvise a variant of VAT with an audio cortex before fully generalizing it | 2025-11-22 07:51:19 -08:00
lucidrains | 1374b93145 | the paper claims finetuning everything was better, but allow for freezing the visual cortex anyway, which is what PI proposes | 2025-11-09 10:59:55 -08:00
lucidrains | 4386742cd1 | an option to return zero for the decorr aux loss if there are insufficient samples | 2025-11-09 10:08:06 -08:00
lucidrains | 5cf8384c56 | add a ViT with decorrelation auxiliary losses for MHA and feedforwards, right after the prenorm - in line with a paper from the Netherlands, but without extra parameters or their manual SGD update scheme (sketched below) | 2025-10-28 12:17:32 -07:00
lucidrains | f7d59cecb5 | some register tokens cannot hurt for VAT | 2025-10-24 14:00:38 -07:00
lucidrains | a583cb5988 | last tweak to VAT | 2025-10-23 12:21:09 -07:00
lucidrains | 25871013f5 | forgot task conditioning for VAT | 2025-10-23 10:55:16 -07:00
lucidrains | e66862bcd5 | add VAT from ICLR 2026, which claims SOTA on LIBERO using a relatively simple scheme (#350) | 2025-10-23 10:23:53 -07:00
lucidrains | 39fd9ac8be | for the n-dimensional ViT, add a method for fetching Muon-friendly parameters (sketched below) | 2025-10-13 12:07:48 -07:00
lucidrains | 3becf087bb | have a language model address https://github.com/lucidrains/vit-pytorch/issues/348 | 2025-09-25 06:21:13 -07:00
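
Hedged sketches for a few of the commits above.

For the register token commits (6aa0374313, f7d59cecb5): register tokens, following Darcet et al., "Vision Transformers Need Registers", are learned tokens concatenated to the patch (or audio spectrogram) sequence and discarded before readout. A minimal sketch, assuming that convention; the class and method names below are illustrative, not vit-pytorch's actual API.

```python
import torch
from torch import nn

class Registers(nn.Module):
    def __init__(self, dim, num_register_tokens = 4):
        super().__init__()
        # learned register tokens, shared across the batch
        self.tokens = nn.Parameter(torch.randn(num_register_tokens, dim) * 0.02)

    def prepend(self, x):
        # x: (batch, seq, dim) -> (batch, num_registers + seq, dim)
        registers = self.tokens.unsqueeze(0).expand(x.shape[0], -1, -1)
        return torch.cat((registers, x), dim = 1)

    def strip(self, x):
        # drop the register tokens again before pooling / classification
        return x[:, self.tokens.shape[0]:]

# usage: prepend before the transformer blocks, strip before readout
reg = Registers(dim = 512)
x = torch.randn(2, 196, 512)
x = reg.strip(reg.prepend(x))  # back to (2, 196, 512)
```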
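For the decorrelation commits (5cf8384c56, 4386742cd1): one plausible form of a decorrelation auxiliary loss is to penalize off-diagonal covariance of the features taken right after the pre-norm, which adds no extra parameters. A minimal sketch under that assumption; the early return mirrors 4386742cd1, and the function name is hypothetical.

```python
import torch

def decorr_aux_loss(feats, min_samples = 2):
    # feats: (num_samples, dim), e.g. tokens gathered right after the pre-norm,
    # flattened across batch and sequence
    n, d = feats.shape

    # per 4386742cd1: return zero if too few samples to estimate a covariance
    if n < min_samples:
        return feats.new_zeros(())

    centered = feats - feats.mean(dim = 0)
    cov = (centered.t() @ centered) / (n - 1)

    # penalize only cross-feature covariance, leaving per-feature variance alone
    off_diag = cov - torch.diag(torch.diagonal(cov))
    return off_diag.pow(2).mean()
```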
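For the Muon commit (39fd9ac8be): Muon's orthogonalized update applies to matrix-shaped weights, so a "Muon-friendly parameters" method plausibly splits those out from biases, norms, and embeddings, which stay with a standard optimizer such as AdamW. A sketch of that common partition, not necessarily the repo's actual method.

```python
from torch import nn

def muon_friendly_parameters(model: nn.Module):
    # matrix-shaped weights (attention / feedforward projections) suit Muon;
    # excluding embeddings here is a common convention, not a requirement
    muon_params, adamw_params = [], []
    for name, param in model.named_parameters():
        if param.ndim >= 2 and 'embed' not in name:
            muon_params.append(param)
        else:
            adamw_params.append(param)
    return muon_params, adamw_params
```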