Commit f056625

Fix README
1 parent a280dc1 commit f056625

File tree

1 file changed: 1 addition, 1 deletion


README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -7,7 +7,7 @@ This is the GitHub repository for the paper "Harmonic Loss Trains Interpretable
 ## What is Harmonic Loss?
 - Harmonic logit $d_i$ is defined as the $l_2$ distance between the weight vector $\mathbf{w}_i$ and the input (query) $\mathbf{x}$: $d_i = \|\mathbf{w}_i - \mathbf{x}\|_2$.

-- The probability $p_i$ is computed using the harmonic max function: $$p_i = \text{HarmonicMax}(\mathbf{d})_i \equiv \frac{1/d_i^n}{\sum_{j} 1/d_j^n},$$ where $n$ is the **harmonic exponent**—a hyperparameter that controls the heavy-tailedness of the probability distribution.
+- The probability $p_i$ is computed using the harmonic max function: $$p_i = \textrm{HarmonicMax}(\mathbf{d})_i \equiv \frac{1/d_i^n}{\sum_{j} 1/d_j^n},$$ where $n$ is the **harmonic exponent**—a hyperparameter that controls the heavy-tailedness of the probability distribution.

 - Harmonic Loss achieves (1) **nonlinear separability**, (2) **fast convergence**, (3) **scale invariance**, (4) **interpretability by design**, properties that are not available in cross-entropy loss.
```
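The two formulas in the README excerpt fully specify the forward pass, so it can be sketched directly in NumPy. This is a minimal illustration, not the repository's code: the function names, the example weights, and the choice $n = 2$ are all assumptions, and the final loss line takes "harmonic loss" to mean the negative log of the target-class probability.

```python
import numpy as np

def harmonic_logits(W, x):
    """Harmonic logit d_i = ||w_i - x||_2 for each weight row w_i."""
    return np.linalg.norm(W - x, axis=1)

def harmonic_max(d, n=1.0):
    """HarmonicMax: p_i = (1/d_i^n) / sum_j (1/d_j^n)."""
    inv = 1.0 / np.power(d, n)
    return inv / inv.sum()

# Example: three 2-D class weight vectors and one query point.
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x = np.array([0.9, 0.1])

d = harmonic_logits(W, x)        # distances to each class weight
p = harmonic_max(d, n=2.0)       # probabilities sum to 1; nearest class gets the most mass

target = 0
loss = -np.log(p[target])        # negative log-probability of the target class (assumption)
```

Larger $n$ makes the distribution heavier-tailed toward the nearest weight vector: as $n \to \infty$, all probability mass concentrates on the class with the smallest distance.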
