
Text Generation with OpenAI GPT-2

A screenshot of SOLVVE Text generator

The machine learning department at SOLVVE continues to experiment with different tools and technologies, as it previously did with object recognition and with score and text recognition. This time, we would like to introduce you to text generation tools, the principles behind them, and their potential applications in business. Below you will also be able to generate your own texts with our tool.

How does it work?

At the core of this solution is a neural network from OpenAI called GPT-2, which stands out in two respects. The first is the number of parameters it uses to generate text. Where typical language models work with roughly 100-300 million parameters, GPT-2 has about 1.5 billion. This allows it to generate large amounts of coherent text (we are talking pages) of rather high quality.

The technology rests on GPT-2’s ability to “foresee” the word that should come next in a sentence. Simply speaking, the word choice depends on how often words follow one another in the phrases the model has seen. For example, if you start a sentence with “Let me…”, the system will continue it with “…help you”, “…ask you”, “…show you”, and so on, picking the continuation that is used most often and fits the context. And so it continues, building pages of text. You can see an example of input and output text here.
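To make this concrete, below is a minimal sketch of next-word prediction and generation with the publicly released GPT-2 weights. It uses the Hugging Face transformers library purely for illustration (the post does not say which tooling our generator is built on), and the “gpt2” checkpoint is the small public model rather than the full 1.5-billion-parameter version.

```python
# Minimal sketch: inspect GPT-2's next-word candidates and generate a continuation.
# Library choice (Hugging Face transformers) and the small "gpt2" checkpoint are
# illustrative assumptions, not the setup behind SOLVVE's tool.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "Let me"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# 1) What does the model "foresee" as the next word?
with torch.no_grad():
    next_token_logits = model(input_ids).logits[0, -1]    # scores for every possible next token
top_ids = torch.topk(next_token_logits, k=5).indices
print([tokenizer.decode([int(i)]) for i in top_ids])      # five most likely continuations of "Let me"

# 2) Keep picking likely words to build a longer text.
output = model.generate(
    input_ids,
    max_length=40,                        # total length in tokens, prompt included
    do_sample=True,                       # sample among likely words instead of always taking the top one
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,  # avoid the padding warning for open-ended generation
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Sampling among the likely words (the do_sample and top_k settings above) is what keeps the output varied; always taking the single most probable word tends to produce repetitive text.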

Secondly, GPT-2 was able to deliver such high-quality results without additional training of the model. This is unusual, since language models normally need extra training to meet the requirements of a specific task. That does not mean additional training should be skipped, though: while GPT-2 performs exceptionally well out of the box compared to other models, for practical applications, especially commercial ones, additional training is a must to meet your goals.

What are the potential applications of text generation?

First of all, the model is helpful for generating texts of varying complexity and tone of voice. This is useful if, for example, you need to produce content of a predefined type: comments on social media, descriptions for inventory in online shops, articles on a predefined range of topics, and so on.

Secondly, GPT-2 can summarize large bodies of text, which can save man-hours on preparing reports or image captions. Thirdly, the model has also proved helpful in unsupervised translation and speech recognition.
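For summarization in particular, the GPT-2 paper relied on a simple zero-shot trick: append a “TL;DR:” cue to the input text and let the model write the continuation. A rough sketch of that idea, again assuming the Hugging Face transformers library and using a placeholder article, could look like this:

```python
# Rough sketch of zero-shot summarization with GPT-2 via the "TL;DR:" cue.
# The library choice and the generation parameters are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

article = "..."  # placeholder for the long text you want to condense
prompt = article + "\nTL;DR:"

result = generator(prompt, max_new_tokens=60, do_sample=True, top_k=50)
summary = result[0]["generated_text"][len(prompt):].strip()  # keep only the part after "TL;DR:"
print(summary)
```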

As for future improvements to this model and their potential applications, one can experiment with input/output data using the full version of GPT-2 or the newer GPT-3 model. It is also possible to offer services that let others additionally train the model. For example, you can make your model stick to a certain style by additionally training it on a data set of your choice: feed the model all of Tolstoy’s works and get texts written in the same tone of voice.
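As a sketch of what such additional training could look like, here is one possible fine-tuning route built on the Hugging Face Trainer API. The corpus file tolstoy.txt and all hyperparameters are hypothetical illustrations, not our actual setup.

```python
# Sketch: fine-tune GPT-2 on a corpus of your choice so that generated text
# adopts its style. "tolstoy.txt" is a hypothetical plain-text corpus.
from transformers import (
    DataCollatorForLanguageModeling,
    GPT2LMHeadModel,
    GPT2Tokenizer,
    TextDataset,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Split the corpus into fixed-length blocks of token ids
# (TextDataset is an older helper, kept here for brevity).
train_dataset = TextDataset(
    tokenizer=tokenizer,
    file_path="tolstoy.txt",   # hypothetical training corpus
    block_size=128,
)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)  # causal LM, no masking

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gpt2-tolstoy",
        num_train_epochs=3,
        per_device_train_batch_size=2,
    ),
    data_collator=collator,
    train_dataset=train_dataset,
)
trainer.train()
trainer.save_model("gpt2-tolstoy")  # reload later with GPT2LMHeadModel.from_pretrained("gpt2-tolstoy")
```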

Lastly, and unfortunately, these same capabilities of GPT-2 can be misused to create malicious content: generating fake news, posting hateful comments, spamming, impersonating people, and other unethical activities that involve content production.

I would like to have this technology for my business

The SOLVVE machine learning department has created a comment generator using GPT-2. You can try it for yourself here.

The machine learning department at SOLVVE already has hands-on experience with this technology and can help you plan, develop, and deploy text generation features for your product or application. If you have any questions or ideas about using machine learning in your business, do not hesitate to contact us. Let us make it happen!