Paranoid Transformer: Reading Narrative of Madness as Computational Approach to Creativity

07/13/2020
by Yana Agafonova, et al.

This paper revisits receptive theory in the context of computational creativity. It presents a case study of the Paranoid Transformer, a fully autonomous text generation engine whose raw output can be read as the narrative of a mad digital persona without any additional human post-filtering. We describe the technical details of the generative system, provide examples of its output, and discuss the impact of receptive theory, chance discovery, and the simulation of a fringe mental state on the understanding of computational creativity.
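The abstract does not spell out the generation pipeline, so the following is only a minimal sketch of what a fully autonomous (no human in the loop) generation loop for a transformer language model can look like, assuming a GPT-2-style causal model served through the Hugging Face transformers library. The model name, seed text, and sampling parameters are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch of an autonomous text generation loop using a
# GPT-2-style causal language model from Hugging Face `transformers`.
# The model name, prompt, and sampling hyperparameters below are
# illustrative assumptions, not the Paranoid Transformer's actual setup.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "I am"  # assumed seed text; the real system may run unconditioned
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        input_ids,
        max_length=60,
        do_sample=True,   # stochastic sampling rather than greedy decoding
        top_k=50,         # assumed sampling hyperparameters
        top_p=0.95,
        temperature=1.0,
        pad_token_id=tokenizer.eos_token_id,
    )

# Raw, unfiltered output: no human post-selection is applied.
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Stochastic sampling (rather than greedy decoding) is what makes repeated runs produce varied, sometimes incoherent text, which is the kind of raw output the paper proposes to read as the narrative of a mad digital persona.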
