Use this cutting-edge AI text generator to write stories, poems, news articles, and more

Even the most advanced chatbots can't hold a decent conversation, but AI systems are definitely getting better at generating the written word. A new web app provides ample proof, letting anyone enter a text prompt that the AI software will automatically respond to.

Enter the start of a made-up news article and it will finish it for you. Ask it a question (by formatting your input like this: "Q: What should I do today?"), and it will happily respond.

The site is called TalkToTransformer.com, and it's the creation of Canadian engineer Adam King. King built the site, but the underlying technology comes from the research lab OpenAI. Earlier this year, OpenAI unveiled its new AI language system, GPT-2, and TalkToTransformer is a slimmed-down, accessible version of that same technology, which until now has been available only to select scientists and journalists. (The name "transformer" refers to the type of neural network used by GPT-2 and other systems.)
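For those who would rather poke at the model directly, here is a minimal sketch of the same prompt-in, continuation-out idea. It assumes the publicly released GPT-2 weights and the Hugging Face transformers library; how TalkToTransformer.com itself is built isn't documented here, so this is an illustration rather than the site's actual code.

    # A minimal sketch, not TalkToTransformer's implementation: generate a
    # continuation for a text prompt using the publicly released GPT-2
    # weights via the Hugging Face "transformers" library.
    from transformers import pipeline

    # Load the small GPT-2 model as a text-generation pipeline.
    generator = pipeline("text-generation", model="gpt2")

    # The same Q&A-style prompt format the site suggests.
    prompt = "Q: What should I do today?\nA:"

    # Ask for one continuation of up to 60 tokens (prompt included).
    result = generator(prompt, max_length=60, num_return_sequences=1)

    print(result[0]["generated_text"])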

If you're curious about AI language generation, there's no better way to understand its enormous potential and serious limitations than by playing around with TalkToTransformer.

On the plus side, the model is incredibly flexible. It's able to recognize a huge variety of inputs, from news articles and stories to song lyrics, poems, recipes, code, and HTML. It can even identify characters from familiar franchises such as Harry Potter and The Lord of the Rings.

At the same time, you'll soon see that, at a fundamental level, the system doesn't understand language or the world at large. The text it generates has surface-level coherence but no long-term structure. When it writes stories, for example, characters appear and disappear at random, with no consistency in their needs or actions. When it generates dialogue, conversations drift aimlessly from one topic to another. If you get more than a few good responses in a row, it's luck rather than skill.

Even so, as The Verge explained in its original coverage of GPT-2, the system is hugely impressive. Remember: this is a single algorithm that learned to generate text by studying a huge dataset scraped from the web and other sources. It learned by looking for patterns in that information, and the result is a surprisingly multitalented system.

It may not be hard to find gaps in GPT-2's knowledge, but it's impossible to know whether you've explored the limits of what it can do.
