Autoregressive Transformers
Large Language Models are misnamed. In reality, they can model and predict token streams of any kind: text, audio, images, or, in the case of this wonderful paper by Vongani Hlavutelo Maluleke, the next move of a dancer based on both their prior moves and those of their partner.
Andrej Karpathy suggests a better name would be 'autoregressive transformers': token generation conditioned on the previous tokens (autoregressive), weighted by an assessment of how much each token in the input matters to the prediction (transformers).
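To make those two ideas concrete, here is a minimal sketch in NumPy of the loop that name describes: a single causal attention head scores every earlier token's relevance, and generation appends one predicted token at a time. The `embed` and `unembed` matrices and the greedy `argmax` choice are illustrative placeholders, not anyone's production model.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_self_attention(E):
    """One attention head: each position mixes the embeddings of itself
    and earlier positions, weighted by learned-style relevance scores.
    The causal mask is what makes the model autoregressive."""
    T, d = E.shape
    scores = E @ E.T / np.sqrt(d)            # how much each token attends to each other token
    mask = np.tril(np.ones((T, T)))          # lower triangle: no peeking at future tokens
    scores = np.where(mask == 1, scores, -1e9)
    return softmax(scores) @ E

def generate(tokens, embed, unembed, steps):
    """The autoregressive loop: predict one token, append it, repeat."""
    tokens = list(tokens)
    for _ in range(steps):
        E = embed[tokens]                    # (T, d) embeddings of the context so far
        h = causal_self_attention(E)
        logits = h[-1] @ unembed             # next-token scores from the last position
        tokens.append(int(np.argmax(logits)))  # greedy pick; real models usually sample
    return tokens
```

Each pass through the loop re-reads the whole growing sequence, which is why every generated token can depend on all the tokens before it.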
At Featuring You we use human features and outcomes as our tokens, training a model to predict how outcomes might change given changes in the features that can be changed, constrained by those that cannot. Our mission is personal optimisation, not personal development: leveraging your natural advantages to achieve maximum output, be it success, fulfilment or happiness.