A Digest of Deep Learning Pearls

All you need is time and a GPU

Try to allocate time for these thought-provoking Deep Learning papers. Some of them come with a try-it-yourself implementation on GitHub.

1. Try it yourself at home, or anywhere at all (with a GPU)

Transformer: more than meets the eye!
– A novel approach to language understanding from Google Brain (via David Ha)
It is a very interesting Deep Learning solution to an old linguistic/syntactic challenge (anaphora). See the more detailed explanation of anaphora resolution.
– Based on the “Attention Is All You Need” paper
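To make the attention idea concrete, here is a minimal NumPy sketch of the scaled dot-product attention at the heart of the Transformer. The function name, shapes, and the toy self-attention usage are my own illustration, not code from the paper or its GitHub implementation.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """q, k: (seq_len, d_k); v: (seq_len, d_v)."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                        # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
    return weights @ v                                     # weighted sum of values

# Toy usage: self-attention over 4 tokens with 8-dimensional vectors.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```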

2. Learning To Remember Rare Events

An interesting approach that introduces a memory module into various types of Deep Learning architectures to provide them with life-long learning.
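As a rough illustration of the idea, here is a toy, non-differentiable key-value memory: keys are normalized embeddings, values are labels, and queries are answered by nearest-neighbour lookup. The class name, memory size, and oldest-slot replacement rule are simplifying assumptions of this sketch; the paper's module is trained end to end with a dedicated memory loss.

```python
import numpy as np

class KeyValueMemory:
    def __init__(self, key_dim, size=1024):
        self.keys = np.zeros((size, key_dim))       # unit-norm embeddings
        self.values = np.zeros(size, dtype=np.int64)  # class labels
        self.age = np.zeros(size)                   # oldest slot gets overwritten

    def _normalize(self, x):
        return x / (np.linalg.norm(x) + 1e-8)

    def query(self, embedding):
        q = self._normalize(embedding)
        sims = self.keys @ q                        # cosine similarity to all keys
        nearest = int(np.argmax(sims))
        return self.values[nearest], sims[nearest]

    def write(self, embedding, label):
        slot = int(np.argmax(self.age))             # replace the oldest entry
        self.keys[slot] = self._normalize(embedding)
        self.values[slot] = label
        self.age += 1
        self.age[slot] = 0

# Toy usage: remember one rare example, retrieve it from a noisy query.
mem = KeyValueMemory(key_dim=16)
rare = np.random.randn(16)
mem.write(rare, label=7)
print(mem.query(rare + 0.01 * np.random.randn(16)))  # -> label 7, similarity near 1.0
```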

3. One Model To Learn Them All

A unified Deep Learning model that can be applied to inputs from various modalities. It is one step closer toward general DL architectures.
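The core trick is easier to see in miniature: small modality-specific encoders map very different inputs into one shared representation space, and a single shared body operates on that space regardless of where the input came from. Everything below (the dimensions, the two example modalities, the bag-of-embeddings text encoder) is an illustrative assumption, not the MultiModel architecture itself.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 64                                         # width of the shared representation

# Modality-specific "small" networks (weights created once, reused for every input).
W_img = rng.normal(size=(28 * 28, D)) * 0.01   # flattened 28x28 image -> shared space
E_txt = rng.normal(size=(1000, D)) * 0.01      # token embedding table (toy vocab of 1000)
W_body = rng.normal(size=(D, D)) * 0.01        # one shared body for all modalities

def encode_image(img):
    return np.tanh(img.reshape(-1) @ W_img)

def encode_text(token_ids):
    return np.tanh(E_txt[token_ids].mean(axis=0))   # crude bag-of-embeddings

def shared_body(h):
    return np.maximum(0.0, h @ W_body)         # same weights whatever the modality

print(shared_body(encode_image(rng.normal(size=(28, 28)))).shape)  # (64,)
print(shared_body(encode_text(np.array([3, 17, 42]))).shape)       # (64,)
```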

4. Meet Fashion-MNIST

Finally, it is time to ditch MNIST in favor of Fashion-MNIST, which is better in a number of respects. Which ones? Find out for yourself.
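Because Fashion-MNIST keeps MNIST's exact format (28x28 grayscale images, 10 classes, 60k/10k train/test split), swapping it in is usually a one-line change. A minimal sketch with tf.keras follows; the loader is part of Keras, while the tiny model and hyperparameters are just placeholders.

```python
import tensorflow as tf

# Drop-in replacement: fashion_mnist has the same API as the classic mnist loader.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```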

Note:

If you haven't noticed, the one thing all of these items (except for one) have in common is Łukasz Kaiser, a researcher from Google Brain.
