1 Matching Annotations
- Apr 2016
- cs231n.github.io
- Effect of step size. The gradient tells us the direction in which the function has the steepest rate of increase, but it does not tell us how far along this direction we should step. That is why the step size is an important factor in an optimization algorithm: too small a step can make the algorithm take longer to converge, while too large a step can change the parameters too much and overstep the optimum, as the sketch below illustrates.
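A minimal sketch of this trade-off, assuming plain gradient descent on the one-dimensional function f(x) = x^2; the step sizes, starting point, and iteration count are illustrative choices, not values from the cs231n notes.

```python
def gradient_descent(step_size, num_steps=25, x0=5.0):
    """Run vanilla gradient descent on f(x) = x**2, whose gradient is 2*x."""
    x = x0
    for _ in range(num_steps):
        grad = 2.0 * x          # analytic gradient of f at the current x
        x -= step_size * grad   # update the parameter along the negative gradient
    return x

# The optimum of f(x) = x**2 is x = 0; the step sizes below are hypothetical.
print(gradient_descent(step_size=0.01))  # too small: still far from 0 after 25 steps
print(gradient_descent(step_size=0.1))   # reasonable: ends up close to 0
print(gradient_descent(step_size=1.1))   # too large: overshoots 0 each step and diverges
```

With a too-small step the iterate shrinks only slightly per update, with a moderate step it approaches the optimum, and with a too-large step each update flips past the optimum with growing magnitude.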