RNNs (Recurrent Neural Networks) are a class of models used for seq2seq (sequence-to-sequence) problems. They are useful in situations where the current output also depends on previous computations: the network carries a "memory" that captures information about what has been computed so far. Join us for a quick introduction to RNNs by building a basic full adder, where the model essentially learns to track the hidden "carry bit" by itself.
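To see why the full adder is a natural RNN toy problem, here is the classical (non-learned) version in plain Python: the carry plays exactly the role of the hidden state, since each output bit depends on it and it is updated at every time step. This sketch is our own illustration, not code from the talk.

```python
def full_adder_sequence(a_bits, b_bits):
    """Add two equal-length bit strings, fed least-significant bit first.

    `carry` is the "hidden state": each output bit depends on it,
    and it is updated at every step -- this is what the RNN has to
    learn to represent internally.
    """
    carry = 0
    out = []
    for a, b in zip(a_bits, b_bits):
        out.append(a ^ b ^ carry)               # sum bit
        carry = (a & b) | (carry & (a ^ b))     # next carry
    return out

# 3 + 1 = 4, LSB first: [1,1,0] + [1,0,0] -> [0,0,1]
print(full_adder_sequence([1, 1, 0], [1, 0, 0]))
```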
Talk Content:
1) Introduction to neural networks and the RNN model.
2) Problems with the initial RNN models and an introduction to LSTMs (Long Short-Term Memory networks).
3) Defining the full adder problem and what we expect the model to learn.
4) Basics of PyTorch (with a brief history).
5) Code session: implementation of an LSTM-based RNN that learns the full adder on bit-strings.
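As a taste of the code session, the setup might look something like the following: an LSTM reads the two addends one bit-pair per time step (LSB first) and emits one sum bit per step, so the carry must be learned inside the hidden state. This is only a sketch under our own assumptions; the class name, hidden size, and training hyperparameters are illustrative choices, not the talk's actual code.

```python
import torch
import torch.nn as nn

class AdderLSTM(nn.Module):
    """Illustrative model: (batch, seq_len, 2) bit-pairs in,
    one sum-bit logit per time step out."""

    def __init__(self, hidden_size=16):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 1)  # logit for the sum bit

    def forward(self, x):                # x: (batch, seq_len, 2)
        h, _ = self.lstm(x)              # h: (batch, seq_len, hidden)
        return self.head(h).squeeze(-1)  # logits: (batch, seq_len)

def random_batch(batch=64, bits=8):
    """Random addition pairs, LSB first so the carry propagates
    in the direction the sequence is read."""
    a = torch.randint(0, 2, (batch, bits))
    b = torch.randint(0, 2, (batch, bits))
    carry = torch.zeros(batch, dtype=torch.long)
    sum_bits = []
    for t in range(bits):
        s = a[:, t] + b[:, t] + carry
        sum_bits.append(s % 2)
        carry = s // 2
    x = torch.stack([a, b], dim=-1).float()
    y = torch.stack(sum_bits, dim=-1).float()
    return x, y

model = AdderLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()
for step in range(200):
    x, y = random_batch()
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()
```

Feeding the bits LSB first matters: it means the carry only ever flows forward in time, so a unidirectional LSTM can represent it.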