Thread

2 tweets

1
Ah, I was under the impression that in LoRA you keep the weight matrices frozen but learn a low-rank update for fine-tuning, but maybe I misunderstood. Yeah, OK, the matrices themselves aren't really low-rank, but maybe low-rank "enough" to allow some compression...
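
(Not part of the thread, but for reference: a minimal sketch of the LoRA idea tweet 1 is describing, assuming PyTorch. The layer sizes, rank, alpha, and initialization below are illustrative choices, not anything taken from the discussion.)

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Linear layer with a frozen pretrained weight W plus a trainable low-rank update B @ A."""

    def __init__(self, in_features: int, out_features: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        # Stand-in for the pretrained weight; in practice it would be copied from the base model.
        self.weight = nn.Parameter(torch.randn(out_features, in_features), requires_grad=False)
        # Low-rank factors: the only parameters that receive gradients during fine-tuning.
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = x W^T + s * x (B A)^T; the update delta_W = B @ A has rank at most `rank`.
        return x @ self.weight.T + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)


layer = LoRALinear(768, 768, rank=8)
x = torch.randn(4, 768)
y = layer(x)  # shape (4, 768); only lora_A and lora_B are trainable
```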
2
@rasbt OK, but yeah, if it doesn't immediately ring a bell it probably hasn't been published (and probably doesn't work) :)