jukofyork committed
Commit 90bf200 (verified) · 1 parent: 138a0cb

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -97,7 +97,7 @@ whereas the general (non-orthogonal) "multiplicative-LoRA" method can do this by
 
 In general, the way to think about these (non-orthogonal) "multiplicative-LoRAs" is as a kind of "conditional control-vector":
 
-- Each vector in `lora_A` looks for a certain dirrection, and via the dot-product it generates (signed) weighting factor that measures the similarity between the output of the `down_proj` transformation.
+- Each vector in `lora_A` looks for a certain dirrection, and via the dot-product it generates a (signed) weighting factor that measures the similarity between the output of the `down_proj` transformation.
 - Each corresponding vector in `lora_B` then gets added to the hidden state / residual stream based on the corresponding weighting factor.
 
 So instead of having just a single vector that we add (in essence we add a bias term and create an [affine transformation](https://en.wikipedia.org/wiki/Affine_transformation)), we now have many different control vectors that can be added (stored in `lora_B`), based on how well they match another set of "directional detection vectors" (stored in `lora_A`).
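For readers skimming the diff, here is a minimal sketch of the "conditional control-vector" reading that the changed bullet describes. It is not the repository's code: the names (`hidden_dim`, `rank`, `lora_A`, `lora_B`) and the random tensors are illustrative assumptions, chosen only to show how the dot-products against `lora_A` weight the vectors in `lora_B` that get added to the residual stream.

```python
import torch

# Illustrative sizes (assumptions, not taken from the repository).
hidden_dim, rank = 4096, 16
torch.manual_seed(0)

h = torch.randn(hidden_dim)              # output of down_proj for one token
lora_A = torch.randn(rank, hidden_dim)   # rank "directional detection vectors"
lora_B = torch.randn(hidden_dim, rank)   # rank control-vectors to add

# Each row of lora_A, dot-producted with h, gives a signed weighting factor...
weights = lora_A @ h                     # shape: (rank,)

# ...and each column of lora_B is added to the residual stream, scaled by it.
delta = lora_B @ weights                 # shape: (hidden_dim,)
h_new = h + delta

# Equivalent view: h_new = (I + lora_B @ lora_A) @ h, i.e. a multiplicative
# (matrix) update rather than the single added bias of a plain control-vector.
h_equiv = h + lora_B @ (lora_A @ h)
assert torch.allclose(h_new, h_equiv, atol=1e-5)
```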