
Artificial Intelligence Pens Shakespeare Sonnet!

Hey guys! I recently came across an excellent poetry algorithm named Deep-speare. It was developed by a four-person team from the School of Information and the Graduate School of Education, and it managed to fool people trying to distinguish between human- and bot-written verses nearly 50 per cent of the time.

However, experts could still easily identify machine-generated poetry, and AI may have a long way to go before it can outdo Shakespeare, researchers said. 

Computer scientists at the University of Melbourne in Australia and the University of Toronto in Canada designed an algorithm that writes poetry following the rules of rhyme and metre.

In some ways, the computer's verses were better than Shakespeare's. The rhymes and metre in the machine-generated poetry were more precise than in human-written poems.


Test Run Feedback 

The following excerpt is courtesy of The Economic Times:

"It's very easy for me to tell what's by a computer or not - ridiculously easy," said Adam Hammond, an assistant professor at University of Toronto. 

"We solved two out of four problems," Hammond says, referring to rhyme and metre. 

"The other two are much harder: making something that's readable and something that can evoke emotion in a reader," he said. 

Researchers trained a neural network using nearly 2,700 sonnets from a free digital library. 

The computer uses three models - language, metre and rhyming - and probability to pick the right words for its poem. It produced quatrains, or four lines of verse, in iambic pentameter. 
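To make the three-model idea concrete, here is a minimal Python sketch, entirely my own simplification rather than the researchers' code, of how per-word scores from hypothetical language, metre and rhyme models might be combined into a probability for choosing the next word:

# A minimal sketch (not Deep-speare's actual code) of combining three model
# scores probabilistically. language_score, metre_score and rhyme_score are
# hypothetical stand-ins for the trained models described above.
import math
import random

def pick_next_word(context, candidates, language_score, metre_score, rhyme_score):
    """Score every candidate word under the three models and sample one."""
    weights = []
    for word in candidates:
        # Log-probabilities from each model are simply added here;
        # the real system learns how to weight and combine its components.
        total = (language_score(context, word)
                 + metre_score(context, word)
                 + rhyme_score(context, word))
        weights.append(math.exp(total))
    # Turn the combined scores into a probability distribution and sample.
    return random.choices(candidates, weights=weights, k=1)[0]

In the actual system the components are neural models trained jointly on the sonnet corpus; the sketch only illustrates the probabilistic word-picking step.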

The researchers assessed their results by asking people online to tell the human and algorithm poetry apart. 


Most laymen could not tell that verses like this one were the work of a programmed poet: "With joyous gambols gay and still array/ No longer when he twas, while in his day/ At first to pass in all delightful ways/ Around him, charming and of all his days." 


Journey behind the algorithm


One of the two lines was written by Shakespeare, while the other was penned by Deep-speare. I'll let you guess who wrote which.

Utilizing this trove of rhyming words, the Pythonic Poet composes poems in reverse, beginning each line with its last word, as dictated by the rhyme scheme. It then adds the preceding word, basing its choice on the likelihood that the new word would appear before the one that currently follows it in the original collection of poems. This procedure continues, with the computer considering up to three words at a time, until the line reaches 10 syllables.
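Here is a rough sketch, under my own simplifying assumptions (a one-word context instead of the up-to-three-word window described above, and a crude vowel-group syllable counter), of how composing a line in reverse from its rhyme word might look in Python:

# Build a "which words precede which" table from a corpus, then grow a line
# backwards from its rhyme word until it reaches roughly ten syllables.
import random
import re
from collections import defaultdict

def count_syllables(word):
    # Very crude syllable estimate: count groups of vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def build_preceding_model(poems):
    """Map each word to the words observed immediately before it in the corpus."""
    model = defaultdict(list)
    for poem in poems:
        words = poem.lower().split()
        for prev, cur in zip(words, words[1:]):
            model[cur].append(prev)
    return model

def compose_line_backwards(rhyme_word, model, target_syllables=10):
    line = [rhyme_word]
    syllables = count_syllables(rhyme_word)
    while syllables < target_syllables and model[line[0]]:
        # Prepend a word in proportion to how often it preceded the current one.
        prev = random.choice(model[line[0]])
        line.insert(0, prev)
        syllables += count_syllables(prev)
    return " ".join(line)

For example, compose_line_backwards("day", build_preceding_model(corpus)) would grow a roughly ten-syllable line ending in "day" from whatever collection of poems it was given.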

To help the poems feel more cohesive, the team trained the algorithm to compose around a theme. First, they fed it 2,860 American poems and had it identify common subjects. When a user kickstarts the algorithm by entering a noun like "heart," the algorithm picks a theme, choosing one that isn't too closely related, such as "sadness" rather than the more predictable "love." Paul and Gagliano say their biggest challenge was producing poems that flowed and had a theme without being too obvious. To measure the closeness of the relationship, the computer uses "word vectors," which represent words in numerical terms. As the algorithm writes the poem, it favours words that fall numerically somewhere between the original noun (heart) and the theme (sadness).
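The word-vector idea can be sketched as follows; the vectors dictionary of word embeddings, the betweenness_score name and the candidate list are my own hypothetical stand-ins, not the authors' implementation:

# Favour words whose embeddings fall numerically between the seed noun
# (e.g. "heart") and the chosen theme (e.g. "sadness").
import numpy as np

def betweenness_score(word, seed, theme, vectors):
    """Higher when `word` lies close to the midpoint between the seed and theme vectors."""
    midpoint = (vectors[seed] + vectors[theme]) / 2.0
    distance = np.linalg.norm(vectors[word] - midpoint)
    return -distance  # smaller distance means a higher score

def rank_candidates(candidates, seed, theme, vectors):
    # Sort candidate words so that those "between" heart and sadness come first.
    return sorted(candidates,
                  key=lambda w: betweenness_score(w, seed, theme, vectors),
                  reverse=True)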

In May, their collective efforts earned second place in Dartmouth's first-ever PoetiX competition, which evaluates sonnet generators using the Turing test, a classic measure of artificial intelligence. Judges read six works composed by humans and four created by machines (two from each of the two contenders). Although all of the submitted poems met the technical parameters, none of the judges were fooled by the machine-made verses.

So, with two more problems left to solve, namely readability and emotional resonance, I believe we could be looking at yet another wonder of Deep Learning at the threshold!
