DATA_310_Notes

Notes and code from DATA 310 lectures


Week 4

July 28, 2020

Word Embeddings

Question 1

Question 2
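
A minimal sketch of what an embedding layer does, assuming a setup like the TensorFlow word-embeddings tutorial; the vocabulary size, embedding dimension, and sample indices below are placeholders, not values from the lecture:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# An Embedding layer maps integer word indices to dense vectors
# that are learned during training.
vocab_size = 10000     # placeholder
embedding_dim = 16     # placeholder

embedding = layers.Embedding(vocab_size, embedding_dim)

# A batch of two "sentences" encoded as word indices (made-up data).
sample = np.array([[1, 2, 3, 4], [5, 6, 7, 8]])
vectors = embedding(sample)
print(vectors.shape)  # (2, 4, 16): one 16-dim vector per word
```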

Text Classification with an RNN

Question 1

Without LSTM

With LSTM
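
Reading "Without LSTM" vs. "With LSTM" as comparing a pooling-only classifier to one with a recurrent layer, here is a minimal sketch of the two variants; the layer sizes are assumptions, not the ones used in class:

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, embedding_dim = 10000, 64  # assumed values

# Without LSTM: word vectors are simply averaged over the sequence,
# so word order is ignored.
pooled_model = tf.keras.Sequential([
    layers.Embedding(vocab_size, embedding_dim),
    layers.GlobalAveragePooling1D(),
    layers.Dense(64, activation='relu'),
    layers.Dense(1, activation='sigmoid'),
])

# With LSTM: a bidirectional LSTM reads the sequence step by step,
# so word order can influence the prediction.
lstm_model = tf.keras.Sequential([
    layers.Embedding(vocab_size, embedding_dim),
    layers.Bidirectional(layers.LSTM(64)),
    layers.Dense(64, activation='relu'),
    layers.Dense(1, activation='sigmoid'),
])

for m in (pooled_model, lstm_model):
    m.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])
```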

July 29, 2020

Using NLP to build a sarcasm classifier
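
A rough sketch of the sarcasm-classifier pipeline, assuming the usual news-headlines sarcasm dataset; the file name, vocabulary size, sequence length, and layer sizes below are all assumptions:

```python
import json

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Hypothetical local copy of the sarcasm dataset, where each record
# has a "headline" string and an "is_sarcastic" 0/1 label.
with open('sarcasm.json') as f:
    data = json.load(f)
headlines = [item['headline'] for item in data]
labels = np.array([item['is_sarcastic'] for item in data])

# Turn headlines into padded integer sequences
# (vocab size and max length are assumptions).
tokenizer = Tokenizer(num_words=10000, oov_token='<OOV>')
tokenizer.fit_on_texts(headlines)
padded = pad_sequences(tokenizer.texts_to_sequences(headlines),
                       maxlen=40, padding='post', truncating='post')

# Simple embed -> pool -> dense binary classifier.
model = tf.keras.Sequential([
    layers.Embedding(10000, 16),
    layers.GlobalAveragePooling1D(),
    layers.Dense(24, activation='relu'),
    layers.Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])
model.fit(padded, labels, epochs=10)
```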

Text generation with an RNN

The following is the Shakespeare text generated after two runs:

ROMEO: I may at
Corapiless vial tempisartly,
I'll do him ashes not the mind of your honour.

PETRUCHIO:
Who cousins here?
That stand within the well-comfort in my blow his greeful day: the rest, which bid
It were the England confusion of des
Alas, Montague it pleasure sister,
Thy ends for proud my books. Where is her eye,
Thene were an edmand great are services stend us.

VIRGILIA:
I am the king, who have not die to seiz upon thy sword:
Would the people, of a vilour will shall
Most sugden his substance as ce
Our dread soul venture would tell this most self,
Nor chances after down by what's the port,
My house of yourselves draw justice
Murches the buttafforced. Ho must nature,
Though she would make your design,
If time my mighty stock sadches wearselfea-need him.
Would on home. Your aid,
I go fortain'd for your majesty.

KING HENRY VI:
For that cannot a high absent born. But why, being love thy hand,
I will be clofk.

CLIUF:
Therefore, buring in death; my girl and thine express, he looks any s
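
For reference, the generator behind output like this follows the tutorial's pattern: embed each character, run a GRU, predict logits over the next character, and sample. Below is a minimal, untrained sketch of that loop; the stand-in corpus, vocabulary, and layer sizes are assumptions, so its output here would be random noise:

```python
import tensorflow as tf

# Stand-in corpus; the real model was trained on the Shakespeare text.
vocab = sorted(set("ROMEO: the quick brown fox"))
char2idx = {c: i for i, c in enumerate(vocab)}
idx2char = list(vocab)

# Character-level model: embed -> GRU -> logits over the next character.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(vocab), 64),
    tf.keras.layers.GRU(128, return_sequences=True),
    tf.keras.layers.Dense(len(vocab)),
])

# Generation loop: feed the text so far, sample the next character.
text = list("ROMEO: ")
ids = [char2idx[c] for c in text]
for _ in range(50):
    logits = model(tf.constant([ids]))[:, -1, :]   # logits at the last step
    next_id = tf.random.categorical(logits, num_samples=1)[0, 0].numpy()
    ids.append(int(next_id))
    text.append(idx2char[next_id])
print(''.join(text))
```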

Wrapping the RNN in a bidirectional Keras layer lets it process the sequence twice, forward and backward, so the model can take in and learn a large amount of information in a timely manner and produce more accurate results.
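
A quick sketch of that doubling effect: the Bidirectional wrapper runs a second copy of the layer over the reversed sequence and concatenates the two outputs (the tensor sizes here are arbitrary):

```python
import tensorflow as tf
from tensorflow.keras import layers

# A plain LSTM reads left-to-right; Bidirectional runs a second copy
# right-to-left and concatenates both, doubling the output width.
inputs = tf.random.normal((1, 20, 8))  # (batch, timesteps, features)

forward_only = layers.LSTM(32)(inputs)
both_ways = layers.Bidirectional(layers.LSTM(32))(inputs)

print(forward_only.shape)  # (1, 32)
print(both_ways.shape)     # (1, 64): forward and backward concatenated
```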

Neural machine translation with attention

The three translations are as follows:

In the end, the translations came out fairly well. Very impressive, and I'm glad it worked (especially given how long it took to run: about 4 hours of training)!
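
The piece that makes this work is the attention layer, which scores every encoder output against the current decoder state and takes a weighted sum as context. Below is a sketch of the additive (Bahdanau) attention from the TensorFlow NMT tutorial; the unit count and tensor sizes in the shape check are arbitrary:

```python
import tensorflow as tf

class BahdanauAttention(tf.keras.layers.Layer):
    """Additive attention as in the TensorFlow NMT-with-attention tutorial."""
    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)
        self.W2 = tf.keras.layers.Dense(units)
        self.V = tf.keras.layers.Dense(1)

    def call(self, query, values):
        # query: decoder hidden state, shape (batch, hidden)
        # values: encoder outputs, shape (batch, src_len, hidden)
        query = tf.expand_dims(query, 1)  # (batch, 1, hidden)
        score = self.V(tf.nn.tanh(self.W1(query) + self.W2(values)))
        weights = tf.nn.softmax(score, axis=1)             # over src_len
        context = tf.reduce_sum(weights * values, axis=1)  # (batch, hidden)
        return context, weights

# Shape check with random tensors (sizes are arbitrary).
attn = BahdanauAttention(10)
ctx, w = attn(tf.random.normal((4, 32)), tf.random.normal((4, 7, 32)))
print(ctx.shape, w.shape)  # (4, 32) (4, 7, 1)
```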