Recently Published

Estaciones pluviométricas Mexico
A platform that displays Mexico's rain-gauge (pluviometric) stations.
RNN with GRU
One common issue with Recurrent Neural Networks is exploding or vanishing gradients: the contribution of the previous hidden state (h(t-1)) and its weights either grows extremely large or, more commonly, shrinks to practically zero during backpropagation. Since we have many data points (1440), this needs to be addressed. One approach is a Gated Recurrent Unit (GRU), which changes the hidden-state equation to include an update gate and a reset gate. The update gate balances how much of the new candidate hidden state to incorporate relative to the previous hidden state, while the reset gate controls how much past information to forget when computing the candidate state. The rest of this model ran similarly to our True RNN, with 10 epochs, measuring MSE and MAE.
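The gating logic described above can be sketched as a single GRU step in plain NumPy. This is a minimal illustration of the update/reset-gate equations, not the project's actual trained model; the weight names (W_z, U_z, etc.) and the input/hidden sizes are assumptions for demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU time step: compute h_t from input x and previous state h_prev.

    params is a list of illustrative weights/biases:
    (W_z, U_z, b_z) update gate, (W_r, U_r, b_r) reset gate,
    (W_h, U_h, b_h) candidate hidden state.
    """
    W_z, U_z, b_z, W_r, U_r, b_r, W_h, U_h, b_h = params
    z = sigmoid(W_z @ x + U_z @ h_prev + b_z)             # update gate: how much new state to mix in
    r = sigmoid(W_r @ x + U_r @ h_prev + b_r)             # reset gate: how much past to forget
    h_cand = np.tanh(W_h @ x + U_h @ (r * h_prev) + b_h)  # candidate hidden state
    return (1.0 - z) * h_prev + z * h_cand                # blend previous state and candidate

# Hypothetical sizes, for demonstration only
n_in, n_hid = 3, 4
rng = np.random.default_rng(0)
params = [rng.normal(scale=0.1, size=s)
          for s in [(n_hid, n_in), (n_hid, n_hid), (n_hid,)] * 3]

h = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):  # run 5 time steps of a toy sequence
    h = gru_step(x, h, params)
```

Because the final state is a convex blend of `h_prev` and a bounded candidate, the GRU avoids repeatedly multiplying the state by the same weight matrix, which is what makes gradients explode or vanish in a plain RNN.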
EXAMEN
Rajalaxmi Pradhan
EXAMEN-SEMESTRAL
Cetacean excerpt from essay Introduction
This subsection introduces the broad idea of odontocetes as equal partners with humans in Earth's ongoing experiment with high-level consciousness.
deepak