All you need is time and a GPU
Try to allocate time for these thought-provoking Deep Learning papers. Some of them come with try-it-yourself implementations on GitHub.
1. Try it yourself at home or anywhere at all (with a GPU)
Transformer: more than meets the eye!
– A novel approach to language understanding from Google Brain (via David Ha)
It is a very interesting Deep Learning solution to an old linguistic/syntactic challenge (anaphora). A more detailed explanation of anaphora resolution is available.
– Based on the “Attention Is All You Need” paper; a minimal sketch of its core operation follows
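If you want a feel for what the Transformer actually computes, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of “Attention Is All You Need”. It is a toy illustration, not the paper’s multi-head TensorFlow implementation, and all names and shapes here are my own:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Toy single-head attention; Q, K, V have shape (seq_len, dim)."""
    d_k = Q.shape[-1]
    # Compare every query with every key, scaled to keep the softmax stable.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted average of the value vectors.
    return weights @ V

x = np.random.randn(4, 8)                    # 4 tokens, 8-dim representations
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
print(out.shape)                             # (4, 8)
```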
2. Learning To Remember Rare Events
An interesting approach that introduces a memory module into various types of Deep Learning architectures, providing them with life-long learning. A simplified sketch of the idea follows.
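As a rough intuition for how such a memory works, here is a simplified NumPy sketch of the key-value lookup idea: store normalized keys with labels, answer queries by nearest neighbor, and overwrite stale slots on writes. The class and its update rule are my own simplification, not the paper’s exact module (which also adds a training loss and efficient nearest-neighbor search):

```python
import numpy as np

class Memory:
    """Toy key-value memory: cosine-similarity lookup over stored keys."""
    def __init__(self, size, key_dim):
        self.keys = np.random.randn(size, key_dim)
        self.keys /= np.linalg.norm(self.keys, axis=1, keepdims=True)
        self.values = np.zeros(size, dtype=int)  # stored labels
        self.age = np.zeros(size, dtype=int)     # picks slots to overwrite

    def query(self, q):
        q = q / np.linalg.norm(q)
        sims = self.keys @ q                     # cosine similarity to all keys
        i = int(np.argmax(sims))
        return self.values[i], sims[i]           # nearest neighbor's label

    def write(self, q, label):
        # Simplified update: overwrite the oldest slot with the new pair,
        # so a rare event seen once can be recalled immediately.
        i = int(np.argmax(self.age))
        self.age += 1
        self.keys[i] = q / np.linalg.norm(q)
        self.values[i] = label
        self.age[i] = 0

mem = Memory(size=128, key_dim=16)
mem.write(np.random.randn(16), label=3)
print(mem.query(np.random.randn(16)))
```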
3. One Model To Learn Them All
A unified Deep Learning model that can be applied to inputs from various modalities. It is one step closer to general DL architectures; a toy sketch of the idea follows.
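To make the idea concrete, here is a toy NumPy sketch of the “small modality-specific input networks feeding one shared body” pattern. Every shape and weight here is invented for illustration; the actual MultiModel in the paper combines convolutions, attention, and mixture-of-experts layers:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 64  # width of the shared representation

# Modality-specific input networks project different inputs into one space.
W_text = rng.standard_normal((300, D)) * 0.01   # e.g. a text-embedding input
W_image = rng.standard_normal((784, D)) * 0.01  # e.g. a flattened-image input

# A single shared body (here just one dense ReLU layer) serves all modalities.
W_body = rng.standard_normal((D, D)) * 0.01

def forward(x, W_in):
    h = np.maximum(x @ W_in, 0)       # modality-specific encoding
    return np.maximum(h @ W_body, 0)  # shared trunk, reused across modalities

text_repr = forward(rng.standard_normal(300), W_text)
image_repr = forward(rng.standard_normal(784), W_image)
print(text_repr.shape, image_repr.shape)  # both (64,): one space, many inputs
```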
4. Meet Fashion-MNIST
Finally, it is time to ditch MNIST in favor of Fashion-MNIST, which is better in a number of respects. Which ones? Find out for yourself; the snippet below will get you started.
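Trying it takes one line if you have Keras installed, since Fashion-MNIST ships as a built-in dataset and is a drop-in MNIST replacement (same 28x28 grayscale images, same 10-class labels):

```python
import tensorflow as tf

# Drop-in replacement for tf.keras.datasets.mnist.load_data()
(x_train, y_train), (x_test, y_test) = \
    tf.keras.datasets.fashion_mnist.load_data()
print(x_train.shape, y_train.shape)  # (60000, 28, 28) (60000,)

# The 10 classes are clothing items instead of digits.
class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]
```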
**Note:**
If you haven’t noticed, the one thing all of these items (except one) have in common is Łukasz Kaiser, a researcher at Google Brain.