📜  Dropout Regularization - Code Examples

📅  Last modified: 2022-03-11 15:00:56.201000             🧑  Author: Mango

Code example 1
Dropout works by randomly "dropping out" (zeroing) unit activations in a network for a single gradient step. The more you drop out, the stronger the regularization:

1-) 0.0 = No dropout regularization.
2-) 1.0 = Drop out everything; the model learns nothing.
3-) Values between 0.0 and 1.0 = the useful range; typical rates trade off regularization strength against model capacity.
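The behavior described above can be sketched as a small NumPy function (this is an illustrative implementation of "inverted" dropout, not code from the original article; the function name and arguments are my own):

```python
import numpy as np

def dropout(activations, rate, rng=None):
    """Inverted dropout, applied at training time only.

    Zeros out roughly a fraction `rate` of the activations and scales
    the survivors by 1 / (1 - rate), so the expected value of each
    unit stays the same as without dropout.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    keep_prob = 1.0 - rate
    # Each unit survives independently with probability keep_prob.
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

a = np.ones((4, 5))
# rate 0.0: no dropout, the output equals the input
print(dropout(a, 0.0))
# rate 0.5: about half the units are zeroed, the rest scaled to 2.0
print(dropout(a, 0.5))
```

Note that a rate of exactly 1.0 is degenerate (every unit is dropped, and the rescaling divides by zero), which matches the point above: such a model learns nothing.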