
Commit b1e3d47

Fix README
1 parent db1b3db commit b1e3d47

1 file changed (+2, -3 lines)


README.md

Lines changed: 2 additions & 3 deletions
```diff
@@ -1,4 +1,4 @@
-# Cross-Entropy Loss is NOT What You Need
+# Harmonic Loss Trains Interpretable AI Models
 
 This is the GitHub repository for the paper "Harmonic Loss Trains Interpretable AI Models" [[arXiv]]() [[Twitter]]() [[Github]](https://github.com/KindXiaoming/grow-crystals).
 
@@ -7,8 +7,7 @@ This is the GitHub repository for the paper "Harmonic Loss Trains Interpretable
 ## What is Harmonic Loss?
 - Harmonic logit $d_i$ is defined as the $l_2$ distance between the weight vector $\mathbf{w}_i$ and the input (query) $\mathbf{x}$: $d_i = \|\mathbf{w}_i - \mathbf{x}\|_2$.
 
-- The probability $p_i$ is computed using the harmonic max function:
-$p_i = \text{HarmonicMax}(\mathbf{d})_i \equiv \frac{1/d_i^n}{\sum_{j} 1/d_j^n},$ where $n$ is the **harmonic exponent**—a hyperparameter that controls the heavy-tailedness of the probability distribution.
+- The probability $p_i$ is computed using the harmonic max function: $p_i = \text{HarmonicMax}(\mathbf{d})_i \equiv \frac{1/d_i^n}{\sum_{j} 1/d_j^n},$ where $n$ is the **harmonic exponent**, a hyperparameter that controls the heavy-tailedness of the probability distribution.
 
 - Harmonic Loss achieves (1) **nonlinear separability**, (2) **fast convergence**, (3) **scale invariance**, and (4) **interpretability by design**, properties that are not available in cross-entropy loss.
```
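The two formulas in the README diff can be sketched in plain Python. This is a minimal illustration under the definitions above, not the repository's actual implementation; the function names (`harmonic_logits`, `harmonic_max`, `harmonic_loss`) are hypothetical:

```python
import math

def harmonic_logits(weights, x):
    # d_i = ||w_i - x||_2, the l2 distance from each weight vector to the query
    return [math.sqrt(sum((wi - xi) ** 2 for wi, xi in zip(w, x)))
            for w in weights]

def harmonic_max(d, n=2):
    # p_i = (1/d_i^n) / sum_j (1/d_j^n); n is the harmonic exponent.
    # Assumes all distances are strictly positive (d_i = 0 would divide by zero).
    inv = [1.0 / (di ** n) for di in d]
    s = sum(inv)
    return [v / s for v in inv]

def harmonic_loss(weights, x, target, n=2):
    # Negative log-probability of the target class under HarmonicMax
    p = harmonic_max(harmonic_logits(weights, x), n)
    return -math.log(p[target])

# Example: a query near weight vector 0 gets most of the probability mass
weights = [[0.0, 0.0], [1.0, 1.0]]
x = [0.1, 0.1]
probs = harmonic_max(harmonic_logits(weights, x), n=2)
loss = harmonic_loss(weights, x, target=0, n=2)
```

Note that smaller distance means higher probability, so unlike softmax over dot-product logits, the class prediction is invariant to rescaling all of $1/d_i^n$ by a common factor, and larger $n$ concentrates mass on the nearest weight vector.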