Update softmax-regression.md #1369

Open
wants to merge 1 commit into
base: master
Choose a base branch
from
Open
Changes from all commits
Commits
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
2 changes: 1 addition & 1 deletion chapter_linear-networks/softmax-regression.md
@@ -309,7 +309,7 @@ $$H[P] = \sum_j - P(j) \log P(j).$$
## Exercises

1. We can explore the connection between exponential families and the softmax in more depth.
-1. Compute the second derivative of the softmax cross-entropy loss $l(\mathbf{y},\hat{\mathbf{y}})$.
+1. Compute the second derivative of the softmax cross-entropy loss $l(\mathbf{y},\hat{\mathbf{y}})$ with respect to the unnormalized predictions $o_j$.
1. Compute the variance of the distribution given by $\mathrm{softmax}(\mathbf{o})$ and show that it matches the second derivative computed above.
1. Assume that we have three classes which occur with equal probability, i.e., the probability vector is $(\frac{1}{3}, \frac{1}{3}, \frac{1}{3})$.
    1. What is the problem if we try to design a binary code for it?
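The change clarified here can be checked numerically: for the cross-entropy loss $l(\mathbf{y},\hat{\mathbf{y}})$ with $\hat{\mathbf{y}} = \mathrm{softmax}(\mathbf{o})$, the gradient with respect to $\mathbf{o}$ is $\mathrm{softmax}(\mathbf{o}) - \mathbf{y}$, so the diagonal of the Hessian is $\mathrm{softmax}(\mathbf{o})_j(1 - \mathrm{softmax}(\mathbf{o})_j)$, the variance of a Bernoulli variable with that mean. A minimal NumPy sketch (the function names and the example logits `o` are illustrative, not from the chapter) compares this analytic result against a finite-difference estimate:

```python
import numpy as np

def softmax(o):
    # Numerically stable softmax over the logits o.
    e = np.exp(o - o.max())
    return e / e.sum()

def cross_entropy(o, y):
    # Cross-entropy loss l(y, softmax(o)) for a one-hot label y.
    return -np.sum(y * np.log(softmax(o)))

o = np.array([1.0, -0.5, 2.0])   # example unnormalized predictions
y = np.array([0.0, 0.0, 1.0])    # one-hot label

# Analytic second derivative w.r.t. each o_j:
# d^2 l / d o_j^2 = softmax(o)_j * (1 - softmax(o)_j)
p = softmax(o)
analytic = p * (1 - p)

# Central finite-difference estimate of the Hessian diagonal.
eps = 1e-5
numeric = np.empty_like(o)
for j in range(len(o)):
    e_j = np.zeros_like(o)
    e_j[j] = eps
    numeric[j] = (cross_entropy(o + e_j, y)
                  - 2 * cross_entropy(o, y)
                  + cross_entropy(o - e_j, y)) / eps**2

assert np.allclose(analytic, numeric, atol=1e-4)
```

Note that the second derivative is independent of the label $\mathbf{y}$: the curvature of the loss depends only on the predicted distribution, which is exactly the link to the variance asked for in the next exercise.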