Let’s start by comparing the ideal points of the CCES respondents with the ideal points of the model respondents. A difference of zero indicates perfect agreement between the two; positive values mean the model respondent is more conservative than the CCES respondent, and negative values mean the model respondent is more liberal. If partisans are consistently more extreme in the model than in real life, we can say the models are polarizing.
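For reference, here is a minimal sketch of how the three summary statistics reported in the tables below (mean squared deviation, mean absolute difference, and correlation with the CCES ideal points) can be computed. The function and argument names are placeholders, and it assumes the two sets of ideal points are on the same scale and paired by respondent.

```python
import numpy as np

def compare_ideal_points(cces, model):
    """Summary statistics for model vs. CCES ideal points (illustrative sketch).

    `cces` and `model` are 1-D arrays of ideal points on the same scale,
    paired by respondent. Differences are model minus CCES, so positive
    values mean the model respondent is more conservative.
    """
    cces = np.asarray(cces, dtype=float)
    model = np.asarray(model, dtype=float)
    diff = model - cces
    return {
        "N": len(diff),
        "Mean Sq Dev": float(np.mean(diff ** 2)),
        "Mean Abs Diff": float(np.mean(np.abs(diff))),
        "Corr with CCES": float(np.corrcoef(model, cces)[0, 1]),
    }
```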
Overall
Model | N | Mean Sq Dev | Mean Abs Diff | Corr with CCES |
---|---|---|---|---|
llama3.1:8b | 5000 | 0.423 | 0.515 | 0.792 |
llama3.2:3b | 4992 | 0.528 | 0.580 | 0.764 |
tulu3:8b | 5000 | 0.554 | 0.593 | 0.775 |
mistral-nemo:12b | 5000 | 0.692 | 0.692 | 0.778 |
qwen2.5:7b | 5000 | 0.717 | 0.730 | 0.773 |
gemma2:9b | 4999 | 0.888 | 0.826 | 0.770 |
qwen2.5:14b | 618 | 0.893 | 0.815 | 0.722 |
phi3:14b | 241 | 1.182 | 0.899 | 0.562 |
Republicans
Model | N | Mean Sq Dev | Mean Abs Diff | Corr with CCES |
---|---|---|---|---|
llama3.2:3b | 1415 | 0.295 | 0.425 | 0.377 |
mistral-nemo:12b | 1418 | 0.307 | 0.453 | 0.443 |
llama3.1:8b | 1418 | 0.400 | 0.502 | 0.369 |
tulu3:8b | 1418 | 0.463 | 0.548 | 0.419 |
qwen2.5:7b | 1418 | 0.872 | 0.849 | 0.345 |
phi3:14b | 66 | 0.981 | 0.870 | 0.457 |
qwen2.5:14b | 171 | 1.095 | 0.968 | 0.421 |
gemma2:9b | 1418 | 1.531 | 1.168 | 0.266 |
Democrats
Model | N | Mean Sq Dev | Mean Abs Diff | Corr with CCES |
---|---|---|---|---|
llama3.1:8b | 2088 | 0.441 | 0.525 | 0.444 |
tulu3:8b | 2088 | 0.600 | 0.617 | 0.379 |
llama3.2:3b | 2085 | 0.628 | 0.651 | 0.286 |
gemma2:9b | 2088 | 0.695 | 0.745 | 0.430 |
qwen2.5:7b | 2088 | 0.802 | 0.791 | 0.430 |
qwen2.5:14b | 251 | 1.008 | 0.875 | 0.357 |
mistral-nemo:12b | 2088 | 1.011 | 0.899 | 0.423 |
phi3:14b | 93 | 1.717 | 1.105 | 0.205 |
Independents
Model | N | Mean Sq Dev | Mean Abs Diff | Corr with CCES |
---|---|---|---|---|
llama3.1:8b | 1494 | 0.420 | 0.512 | 0.717 |
qwen2.5:7b | 1494 | 0.453 | 0.532 | 0.675 |
gemma2:9b | 1493 | 0.547 | 0.614 | 0.708 |
qwen2.5:14b | 196 | 0.569 | 0.606 | 0.632 |
tulu3:8b | 1494 | 0.575 | 0.601 | 0.676 |
llama3.2:3b | 1492 | 0.610 | 0.626 | 0.682 |
mistral-nemo:12b | 1494 | 0.611 | 0.629 | 0.695 |
phi3:14b | 82 | 0.736 | 0.687 | 0.455 |
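Continuing the sketch above, the per-group tables could be produced with a grouped summary along these lines; the dataframe name `results` and its columns (`model`, `party`, `cces_ideal`, `model_ideal`) are assumptions, not the actual pipeline.

```python
import pandas as pd

# `results` is a hypothetical long-format dataframe with one row per
# simulated respondent: columns `model`, `party`, `cces_ideal`, `model_ideal`.
per_party = (
    results
    .dropna(subset=["model_ideal"])  # assumes rows without a usable model response are dropped
    .groupby(["party", "model"])
    # Reuses compare_ideal_points from the sketch above for each (party, model) cell.
    .apply(lambda g: pd.Series(compare_ideal_points(g["cces_ideal"], g["model_ideal"])))
    .reset_index()
    .sort_values(["party", "Mean Sq Dev"])  # each table above is sorted by Mean Sq Dev
)
```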