Text generation with an RNN, TensorFlow and OpenAI GPT-2

Below you can find a collection of short stories and dialogues about relationships, generated with an RNN, TensorFlow and GPT-2.

If you are a writer, a developer, or just curious, please get in touch with us and we can discuss how to collaborate.

The process of creation:
We are teaching a machine-learning algorithm how to write about Love in our style (cheesy, ironic and nonsensical), using the code available on GitHub and TensorFlow, a machine-learning library made by Google.
Then, our team reviews and publishes these instances of Deep Writing as stories or dialogues.
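
As a rough illustration of the training step, here is a minimal character-level RNN language model in TensorFlow; the corpus file name and the hyperparameters are assumptions for the sketch, not the exact values we use:

```python
import numpy as np
import tensorflow as tf

# Assumed corpus file: a plain-text collection of love stories and dialogues.
text = open("love_corpus.txt", encoding="utf-8").read()

# Map each character to an integer id.
vocab = sorted(set(text))
char2id = {c: i for i, c in enumerate(vocab)}
ids = np.array([char2id[c] for c in text])

# Build (input, next-character) training pairs of length 100.
seq_len = 100
dataset = (
    tf.data.Dataset.from_tensor_slices(ids)
    .batch(seq_len + 1, drop_remainder=True)
    .map(lambda chunk: (chunk[:-1], chunk[1:]))
    .shuffle(10_000)
    .batch(64, drop_remainder=True)
)

# A small LSTM language model: embedding -> LSTM -> per-character logits.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(vocab), 256),
    tf.keras.layers.LSTM(512, return_sequences=True),
    tf.keras.layers.Dense(len(vocab)),
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(dataset, epochs=10)
```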

As a next step, we are looking to define our next project using Google Magenta to create our new Love song “You Will Know Love”. We also have to study the IBM Watson chatbot; everyone needs proper customer service.

While we are creating new novels, you can contact OpenAI, which is interested in collaborating with researchers working on language model output detection, bias, and publication norms, and with organizations potentially affected by large language models...

We recommend using the open-source Anaconda Distribution, which is the easiest way to perform Python/R data science and machine learning on Linux, Windows, and Mac OS X...

Our creative process is quite simple: first we train the RNN, then we review and edit the sample, and as the final step we run conditional sample generation in GPT-2 until we have a coherent text...
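
For the GPT-2 step, OpenAI's released code includes conditional sampling scripts; as a rough, non-authoritative sketch of the same idea, here is conditional generation using the Hugging Face transformers port of GPT-2 (the prompt string is only an illustrative placeholder for an edited RNN sample, not one of our texts):

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# An edited RNN sample becomes the conditioning prompt (placeholder text here).
prompt = "She looked at him over the cold coffee and said:"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; top-k / top-p sampling keeps it coherent but varied.
output = model.generate(
    **inputs,
    max_length=200,
    do_sample=True,
    top_k=40,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```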

During our research on multi-layer recurrent neural networks, we discovered the work of OpenAI and their language model GPT-2, which generates coherent paragraphs of text one word at a time...
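
“One word at a time” means the model repeatedly predicts a probability distribution over the next token, samples from it, appends the result and repeats. A minimal sketch of that loop, again using the transformers port as a stand-in for OpenAI's own sampling code:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

ids = tokenizer("Love is", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(50):
        logits = model(ids).logits[:, -1, :]                # scores for the next token only
        probs = torch.softmax(logits, dim=-1)
        next_id = torch.multinomial(probs, num_samples=1)   # sample one token
        ids = torch.cat([ids, next_id], dim=-1)             # append it and repeat
print(tokenizer.decode(ids[0]))
```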

We have decided to focus on only one topic for training our multi-layer recurrent neural network (RNN). We have chosen Love and relationships as the main concept for our short novels...
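
To keep the training data on that single topic, the raw corpus can be filtered by keyword before the RNN ever sees it. A hypothetical sketch (the file names and keyword list are illustrative, not our actual dataset):

```python
# Keep only lines that mention the topic keywords (illustrative list).
KEYWORDS = ("love", "heart", "kiss", "relationship")

with open("raw_texts.txt", encoding="utf-8") as src, \
     open("love_corpus.txt", "w", encoding="utf-8") as dst:
    for line in src:
        if any(keyword in line.lower() for keyword in KEYWORDS):
            dst.write(line)
```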