OpenAI's AI Learned to Write Poems, Articles, and News



    Although chatbots still do not hold conversations with people very well (though they are steadily improving), they handle text much better. You can verify this claim with a web application built on artificial intelligence (its weak form).

    For example, if the user starts writing a news article, the bot can complete it. The technology also supports written "communication" with a person reasonably well: ask it "What should I do today?" and the program gives a perfectly intelligible answer. The service is available on the web at TalkToTransformer.com.



    The service was designed by Canadian engineer Adam King. It is worth noting that he built only the front end; underneath it is an AI developed by the research organization OpenAI. Earlier this year, OpenAI introduced its own language model, GPT-2, and TalkToTransformer is an opportunity to try this system out.

    Previously, GPT-2 was available only to scientists and journalists selected by the developers for testing. The service is called a "Transformer" after the type of neural network that underlies GPT-2.



    If you want to get acquainted with the language capabilities of AI, there is no better option than TalkToTransformer. The service is quite flexible: it can recognize many kinds of text, including recipes, program code, and song lyrics. It can also identify characters from various literary works, including Harry Potter and The Lord of the Rings.

    At the same time, the system's capabilities are limited: it does not know how to "think" at a large scale and acts superficially. The texts the AI writes can have storylines and characters (if it is a story), but none of it is logically connected: characters appear and disappear, and their actions are random.

    Dialogues are built on the same random principle. If a dialogue turns out more or less coherent, that is more likely luck than a capability of the service. Simpler forms, however, the AI handles quite well, drawing on web sources and other material it was trained on.

    It was earlier reported on Habr that GPT-2 was trained on ordinary Internet pages (about 8 million pages, 40 GB of text). The selection of training sources was limited to pages with a good rating on Reddit; this was done to keep spam and advertising resources from clogging the dataset.
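    The idea behind that filter can be sketched as follows. This is a hypothetical illustration of the principle, not OpenAI's actual WebText pipeline; the GPT-2 paper describes collecting outbound links from Reddit posts with at least 3 karma, and the record format and names below are assumptions.

```python
# Hypothetical sketch: use Reddit karma as a cheap human quality filter
# when selecting pages for a training corpus. Field names ("score", "url")
# are illustrative, not OpenAI's actual data format.
MIN_KARMA = 3  # threshold reported in the GPT-2 paper for WebText

def select_training_urls(reddit_posts):
    """Return outbound URLs from posts whose score passes the threshold."""
    urls = set()
    for post in reddit_posts:
        if post["score"] >= MIN_KARMA and post.get("url"):
            urls.add(post["url"])
    return urls

# Example input in the assumed format: low-scored (likely spam) links drop out.
posts = [
    {"score": 57, "url": "https://example.com/good-article"},
    {"score": 1,  "url": "https://example.com/spammy-ad"},
    {"score": 12, "url": "https://example.com/recipe"},
]
print(select_training_urls(posts))  # the score-1 link is filtered out
```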

    To start a dialogue, you submit the beginning of a phrase, for example, "Mars is ...", after which the system completes the sentence. The network can give answers without special additional training for a specific task.
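    For readers who prefer to try this locally rather than through the web service, here is a minimal sketch using the Hugging Face transformers library and its publicly released "gpt2" checkpoint (a smaller version of the model than OpenAI's full one; the library and sampling settings are assumptions, not what TalkToTransformer itself runs):

```python
# Minimal sketch of prompt completion with a public GPT-2 checkpoint.
# Assumes the Hugging Face "transformers" library is installed;
# sampling settings are illustrative, not TalkToTransformer's own.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Feed the beginning of a phrase; the model continues it.
outputs = generator(
    "Mars is",
    max_length=40,           # prompt plus continuation, in tokens
    num_return_sequences=3,  # several alternative continuations
    do_sample=True,          # sample instead of always taking the top token
    top_k=40,                # restrict sampling to the 40 likeliest tokens
)

for out in outputs:
    print(out["generated_text"])
    print("---")
```

    Note that no fine-tuning step appears anywhere in this snippet: the prompt alone steers the pretrained model, which is exactly the task-agnostic behavior described above.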


