
Syntactic Arrow of Time

Forward-trained LLMs outperform backward-trained ones. Does this asymmetry come from grammar or from meaning? We test this by training on part-of-speech (POS) tag sequences only, stripping away lexical semantics.

[Diagram: token positions t−1, t, t+1]

Highlights

  • POS-sequence GPT variants trained forward and backward
  • Cross-lingual evaluation on English, German, Japanese
  • Finding: grammar alone shows no arrow of time
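The forward/backward setup above can be sketched as a data-preparation step: the backward model sees the same POS tags in reversed temporal order. This is a minimal illustration with hypothetical helper names and a toy tag vocabulary, not the project's actual pipeline.

```python
# Toy POS-tag vocabulary with sequence-boundary tokens (illustrative only).
POS_VOCAB = {"DET": 0, "NOUN": 1, "VERB": 2, "ADP": 3, "<bos>": 4, "<eos>": 5}

def encode(tags):
    """Map a POS-tag sequence to integer IDs with boundary tokens."""
    return [POS_VOCAB["<bos>"]] + [POS_VOCAB[t] for t in tags] + [POS_VOCAB["<eos>"]]

def make_example(tags, direction="forward"):
    """Build one (input, target) pair for next-token prediction.

    direction="backward" reverses the tag order before encoding, so the
    model learns to predict the sequence from right to left.
    """
    if direction == "backward":
        tags = list(reversed(tags))
    ids = encode(tags)
    return ids[:-1], ids[1:]  # shift by one for autoregressive targets

# "the cat sat on the mat" -> DET NOUN VERB ADP DET NOUN
tags = ["DET", "NOUN", "VERB", "ADP", "DET", "NOUN"]
fwd_x, fwd_y = make_example(tags, "forward")
bwd_x, bwd_y = make_example(tags, "backward")
```

The same GPT architecture can then be trained on the forward and backward streams; any performance gap between the two directions is attributable to syntax alone, since no word identities remain.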

Motivation

Research project at EPFL investigating temporal asymmetry in language models.

Tech Stack

Python, PyTorch