Social Foundations of Computation

Do Personality Tests Generalize to Large Language Models



With large language models (LLMs) appearing to behave in increasingly human-like ways in text-based interactions, it has become popular to evaluate various properties of these models using tests originally designed for humans. While re-using existing tests is a resource-efficient way to evaluate LLMs, careful adjustments are usually required to ensure that test results are even valid across human sub-populations. Thus, it is not clear to what extent different tests’ validity generalizes to LLMs. In this work, we provide evidence that LLMs’ responses to personality tests systematically deviate from typical human responses, implying that these results cannot be interpreted in the same way as human test results. Concretely, reverse-coded items (e.g. “I am introverted” vs. “I am extraverted”) are often both answered affirmatively by LLMs. In addition, variation across different prompts designed to “steer” LLMs to simulate particular personality types does not follow the clear separation into five independent personality factors seen in human samples. In light of these results, we believe it is important to pay more attention to tests’ validity for LLMs before drawing strong conclusions about potentially ill-defined concepts like LLMs’ “personality”.
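To illustrate the reverse-coding issue mentioned in the abstract, here is a minimal sketch of how such item pairs are typically scored and checked for consistency. The item wordings, scale range, function names, and tolerance are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch: consistency of Likert responses (1-5) to a
# reverse-coded item pair. All names and values are illustrative.

def reverse_score(response, scale_min=1, scale_max=5):
    """Map a reverse-coded item onto the same direction as its pair."""
    return scale_max + scale_min - response

def pair_is_consistent(direct, reversed_item, tolerance=1):
    """After reverse-scoring, the two responses should roughly agree.
    Answering both 'I am introverted' and 'I am extraverted'
    affirmatively (e.g. both 5) is inconsistent."""
    return abs(direct - reverse_score(reversed_item)) <= tolerance

# Human-like pattern: affirms one item, rejects its reverse.
print(pair_is_consistent(5, 1))  # True
# LLM pattern described in the abstract: affirms both items.
print(pair_is_consistent(5, 5))  # False
```

Reverse-scoring with `scale_max + scale_min - response` is the standard transformation for Likert scales; the consistency check itself is just one simple way to flag the contradictory response pattern the paper reports.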

Author(s): Dorner, Florian E.* and Sühr, Tom* and Samadi, Samira and Kelava, Augustin
Book Title: Socially Responsible Language Modelling Research (SoLaR) Workshop, NeurIPS 2023
Year: 2023
Month: October

Department(s): Social Foundations of Computation
Bibtex Type: Poster (poster)

Note: *equal contribution
State: Published
URL: https://openreview.net/pdf?id=zKDSfGhCoK

BibTex

@poster{dorner2023personality,
  title = {Do Personality Tests Generalize to Large Language Models},
  author = {Dorner, Florian E.* and S{\"u}hr, Tom* and Samadi, Samira and Kelava, Augustin},
  booktitle = {Socially Responsible Language Modelling Research (SoLaR) Workshop, NeurIPS 2023},
  month = oct,
  year = {2023},
  note = {*equal contribution},
  url = {https://openreview.net/pdf?id=zKDSfGhCoK},
  month_numeric = {10}
}