
Re: *** Need help with a question. ***



>Hi,
>        I am taking a lisp class at Brandeis and I have a question
>regarding back-propagation.  The question is:
>
>        Why does back-propagation require the threshold functions to
>be continuous and everywhere differentiable?
>
If you are using gradient descent as your method of error
correction/propagation, you need threshold functions you can
differentiate, because the weight updates come from gradients of the
error, and the chain rule pushes those gradients back through each
unit's threshold function.  A hard threshold (step function) has a
derivative of zero everywhere except at the threshold itself, where it
is undefined, so no useful gradient information gets back to the
weights.  However, this does not have to be the case.  If your input
space is not continuous, you can work the math for the discrete case
(using the discrete analogue of the gradient, i.e. finite differences,
for your error correction/propagation function) and use discrete
functions for your thresholds.
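
To make that concrete, here is a minimal sketch in Python (not from the
original question; the single-unit setup, the squared-error cost, and the
learning rate are my own assumptions) of one gradient-descent update for a
lone sigmoid unit, with a comment showing where a hard threshold would kill
the gradient:

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def sigmoid_prime(x):
        s = sigmoid(x)
        return s * (1.0 - s)          # nonzero everywhere, so error flows back

    def step(x):
        return 1.0 if x >= 0.0 else 0.0   # hard threshold: derivative is 0
                                          # almost everywhere, undefined at 0

    # One gradient-descent step for a single weight w on input x, target t,
    # squared error E = 0.5 * (y - t)**2, learning rate eta.
    w, x, t, eta = 0.5, 1.0, 1.0, 0.1
    net = w * x
    y = sigmoid(net)
    dE_dw = (y - t) * sigmoid_prime(net) * x   # chain rule
    w -= eta * dE_dw
    print("updated weight:", w)

    # Swap step() in for sigmoid() and sigmoid_prime() must become the
    # derivative of step(), which is zero almost everywhere, so dE_dw
    # would be zero and the weight would never change.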


Steven Dobbs <sdobbs@ai.uwf.edu>
Institute For Human and Machine Cognition, University of West Florida
Pensacola, Florida

Why isn't "phonetic" spelled the way it sounds?