[Paper Deep Dive] LoRA: Low-Rank Adaptation of Large Language Models
LoRA: Low-Rank Adaptation of Large Language Models (http://arxiv.org/abs/2106.09685). Fine-tuning large models with low-rank matrices: it greatly reduces training cost, is plug-and-play, and leaves the original parameters essentially untouched, so it is a must-learn. The LoRA framework freezes the pretrained model's weights and injects trainable rank-decomposition matrices into each layer of the Transformer architecture…
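As a rough sketch of the idea (not the paper's released implementation; PyTorch is assumed, and the class name LoRALinear, the initialization scale, and the hyperparameter defaults are chosen here just for illustration), a frozen linear layer plus a trainable low-rank update B·A could look like this:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Illustrative LoRA-style layer: h = W0 x + (alpha / r) * B A x.

    W0 is the frozen pretrained weight; only A (r x in) and B (out x r) are trained.
    """
    def __init__(self, in_features: int, out_features: int, r: int = 4, alpha: int = 16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)  # freeze the pretrained weight W0
        # Rank-decomposition pair: A starts small-random, B starts at zero,
        # so the update B @ A is zero at the start of fine-tuning.
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

    @torch.no_grad()
    def merge(self) -> None:
        # "Plug-and-play": fold B @ A back into W0 so inference adds no extra latency.
        self.base.weight += self.scaling * (self.lora_B @ self.lora_A)
```

In the paper this update is applied mainly to the attention projection matrices (e.g. W_q and W_v), and at deployment time the low-rank product can be merged into the frozen weight as in `merge()`, which is why switching tasks only requires swapping the small A/B pairs.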
2025-05-01