TabCAT-EXAMINER Correlations

## Presents correlations between the full TabCAT-EXAMINER and reduced models from which a set of items was deleted. Each graph shows the associations across models that retained a specific number of items. Each column shows models in which the designated test was included; for example, the column “animals_r” presents associations for all models that included the animals variable.

## Warning: Removed 8 rows containing non-finite outside the scale range
## (`stat_boxplot()`).
## Warning: Removed 8 rows containing missing values or values outside the scale range
## (`geom_point()`).
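
As a rough illustration of how the models behind these graphs could be generated and plotted: the toy data, the scoring of each reduced composite as the mean of its z-scored retained tests, and all object names below are assumptions for the sketch, not the pipeline actually used here.

```r
library(dplyr)
library(tidyr)
library(ggplot2)

# Item pool, taken from the variable names in the tables below
items <- c("animals_r", "dotcount_r", "flanker_r", "fwords_r", "lwords_r",
           "match_r", "rundots_r", "setshift_r", "veg_r")

# Toy data purely so the sketch runs; the real cohort data are not reproduced
set.seed(1)
dat <- as.data.frame(matrix(rnorm(200 * length(items)), nrow = 200,
                            dimnames = list(NULL, items)))
dat$executivecomposite_r <- rowMeans(scale(dat[, items]))

# Correlate every reduced model with a given number of retained items against
# the full composite; each reduced composite is sketched as the mean of the
# z-scored retained tests
reduced_model_corrs <- function(dat, n_items) {
  combos <- combn(items, n_items, simplify = FALSE)
  lapply(combos, function(xe) {
    reduced <- rowMeans(scale(dat[, xe]), na.rm = TRUE)
    ct <- cor.test(reduced, dat$executivecomposite_r)
    data.frame(items.Retained = n_items,
               Xe.Retained    = paste(xe, collapse = ","),
               Corr = unname(ct$estimate),
               LCI  = ct$conf.int[1],
               UCI  = ct$conf.int[2],
               P    = ct$p.value)
  }) %>%
    bind_rows()
}

# Stack the 2- through 8-item models
all_corrs <- bind_rows(lapply(2:8, reduced_model_corrs, dat = dat))

# One row per (model, included test): each box plot column then summarizes the
# models that included that test, with one panel per number of retained items
all_corrs %>%
  separate_rows(Xe.Retained, sep = ",") %>%
  ggplot(aes(x = Xe.Retained, y = Corr)) +
  geom_boxplot(outlier.shape = NA) +
  geom_point(position = position_jitter(width = 0.15), alpha = 0.4) +
  facet_wrap(~ items.Retained)
```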

Best performing models

Presents the best performing models by number of items included. The best models were defined by the most positive associations with the full TabCAT-EXAMINER. Models were grouped by the number of items retained, and the top 7 in each group were selected.
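
A minimal sketch of this selection step, assuming `all_corrs` is the stacked results object from the sketch above:

```r
library(dplyr)

# Within each number-of-items group, keep the 7 models with the most positive
# correlation with the full TabCAT-EXAMINER composite
best_models <- all_corrs %>%
  group_by(items.Retained) %>%
  slice_max(Corr, n = 7) %>%
  ungroup() %>%
  arrange(items.Retained, desc(Corr))
```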

Top 7 Correlations with the Full TabCAT-EXAMINER (executivecomposite_r) per Number of Items

| Items Retained | Tests Retained | r | 95% CI (lower) | 95% CI (upper) | p |
|---|---|---|---|---|---|
| 2 | fwords_r, match_r | 0.945 | 0.940 | 0.950 | 0 |
| 2 | animals_r, match_r | 0.942 | 0.937 | 0.947 | 0 |
| 2 | lwords_r, match_r | 0.942 | 0.936 | 0.947 | 0 |
| 2 | match_r, veg_r | 0.934 | 0.927 | 0.940 | 0 |
| 2 | match_r, setshift_r | 0.932 | 0.926 | 0.938 | 0 |
| 2 | flanker_r, match_r | 0.931 | 0.924 | 0.937 | 0 |
| 2 | match_r, rundots_r | 0.900 | 0.891 | 0.909 | 0 |
| 3 | flanker_r, lwords_r, match_r | 0.934 | 0.927 | 0.940 | 0 |
| 3 | flanker_r, fwords_r, match_r | 0.933 | 0.927 | 0.939 | 0 |
| 3 | animals_r, match_r, setshift_r | 0.931 | 0.925 | 0.937 | 0 |
| 3 | lwords_r, match_r, setshift_r | 0.930 | 0.923 | 0.936 | 0 |
| 3 | animals_r, flanker_r, match_r | 0.930 | 0.923 | 0.936 | 0 |
| 3 | fwords_r, match_r, setshift_r | 0.930 | 0.924 | 0.936 | 0 |
| 3 | match_r, setshift_r, veg_r | 0.929 | 0.923 | 0.935 | 0 |
| 4 | flanker_r, fwords_r, lwords_r, match_r | 0.922 | 0.914 | 0.928 | 0 |
| 4 | animals_r, flanker_r, fwords_r, setshift_r | 0.921 | 0.910 | 0.931 | 0 |
| 4 | flanker_r, lwords_r, match_r, setshift_r | 0.921 | 0.914 | 0.928 | 0 |
| 4 | animals_r, flanker_r, match_r, setshift_r | 0.920 | 0.912 | 0.927 | 0 |
| 4 | flanker_r, match_r, setshift_r, veg_r | 0.920 | 0.913 | 0.927 | 0 |
| 4 | flanker_r, fwords_r, match_r, setshift_r | 0.920 | 0.913 | 0.927 | 0 |
| 4 | dotcount_r, flanker_r, fwords_r, setshift_r | 0.918 | 0.906 | 0.928 | 0 |
| 5 | animals_r, dotcount_r, flanker_r, fwords_r, setshift_r | 0.927 | 0.917 | 0.936 | 0 |
| 5 | animals_r, dotcount_r, flanker_r, lwords_r, setshift_r | 0.924 | 0.913 | 0.933 | 0 |
| 5 | dotcount_r, flanker_r, fwords_r, setshift_r, veg_r | 0.923 | 0.913 | 0.933 | 0 |
| 5 | dotcount_r, flanker_r, fwords_r, lwords_r, setshift_r | 0.922 | 0.911 | 0.932 | 0 |
| 5 | animals_r, dotcount_r, flanker_r, setshift_r, veg_r | 0.920 | 0.909 | 0.930 | 0 |
| 5 | animals_r, dotcount_r, flanker_r, match_r, setshift_r | 0.919 | 0.911 | 0.926 | 0 |
| 5 | dotcount_r, flanker_r, match_r, setshift_r, veg_r | 0.918 | 0.910 | 0.925 | 0 |
| 6 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, setshift_r | 0.923 | 0.912 | 0.932 | 0 |
| 6 | animals_r, dotcount_r, flanker_r, fwords_r, setshift_r, veg_r | 0.920 | 0.909 | 0.930 | 0 |
| 6 | dotcount_r, flanker_r, fwords_r, lwords_r, setshift_r, veg_r | 0.920 | 0.908 | 0.929 | 0 |
| 6 | animals_r, dotcount_r, flanker_r, fwords_r, rundots_r, setshift_r | 0.920 | 0.909 | 0.929 | 0 |
| 6 | animals_r, dotcount_r, flanker_r, lwords_r, rundots_r, setshift_r | 0.916 | 0.905 | 0.926 | 0 |
| 6 | animals_r, dotcount_r, flanker_r, lwords_r, setshift_r, veg_r | 0.916 | 0.905 | 0.927 | 0 |
| 6 | dotcount_r, flanker_r, fwords_r, rundots_r, setshift_r, veg_r | 0.915 | 0.904 | 0.925 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, setshift_r, veg_r | 0.918 | 0.907 | 0.928 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, rundots_r, setshift_r | 0.918 | 0.907 | 0.927 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, fwords_r, rundots_r, setshift_r, veg_r | 0.916 | 0.905 | 0.926 | 0 |
| 7 | dotcount_r, flanker_r, fwords_r, lwords_r, rundots_r, setshift_r, veg_r | 0.914 | 0.902 | 0.924 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, lwords_r, rundots_r, setshift_r, veg_r | 0.913 | 0.901 | 0.923 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, lwords_r, match_r, rundots_r, setshift_r | 0.909 | 0.901 | 0.917 | 0 |
| 7 | animals_r, flanker_r, fwords_r, lwords_r, rundots_r, setshift_r, veg_r | 0.909 | 0.896 | 0.919 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, rundots_r, setshift_r, veg_r | 0.914 | 0.903 | 0.924 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, match_r, rundots_r, setshift_r | 0.902 | 0.893 | 0.911 | 0 |
| 8 | dotcount_r, flanker_r, fwords_r, lwords_r, match_r, rundots_r, setshift_r, veg_r | 0.902 | 0.893 | 0.910 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, lwords_r, match_r, rundots_r, setshift_r, veg_r | 0.900 | 0.891 | 0.908 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, fwords_r, match_r, rundots_r, setshift_r, veg_r | 0.900 | 0.891 | 0.909 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, match_r, setshift_r, veg_r | 0.896 | 0.887 | 0.905 | 0 |
| 8 | animals_r, flanker_r, fwords_r, lwords_r, match_r, rundots_r, setshift_r, veg_r | 0.895 | 0.886 | 0.904 | 0 |

#------------------------------#

FTLD CDR Correlations

## Presents correlations between the FTLD CDR Sum of Boxes and reduced models from which a set of items was deleted. Each graph shows the associations across models that retained a specific number of items. Each column shows models in which the designated test was included; for example, the column “animals_r” presents associations for all models that included the animals variable.

## NOTE: The associations were reverse scored so that more positive values indicate stronger associations.

## Warning: Removed 5 rows containing non-finite outside the scale range
## (`stat_boxplot()`).
## Warning: Removed 5 rows containing missing values or values outside the scale range
## (`geom_point()`).

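Continuing the sketch above, the same machinery can be pointed at the clinical outcome. The column name `ftldcdr_sob` matches the table below, while the simulated scores, the helper `cdr_corr`, and the sign flip used for plotting are illustrative assumptions:

```r
# Continuing the toy setup above: add a simulated CDR Sum of Boxes column so
# the sketch runs; the real clinical scores are not reproduced here
dat$ftldcdr_sob <- -dat$executivecomposite_r + rnorm(nrow(dat), sd = 0.5)

# Correlate one reduced composite with the FTLD CDR Sum of Boxes
cdr_corr <- function(dat, xe) {
  reduced <- rowMeans(scale(dat[, xe]), na.rm = TRUE)
  ct <- cor.test(reduced, dat$ftldcdr_sob)
  data.frame(items.Retained = length(xe),
             Xe.Retained    = paste(xe, collapse = ","),
             Corr = unname(ct$estimate),
             LCI  = ct$conf.int[1],
             UCI  = ct$conf.int[2],
             P    = ct$p.value)
}

# Stack over all subset sizes, then reverse score only for plotting so that
# more positive plotted values indicate stronger associations
cdr_all <- bind_rows(lapply(2:8, function(k) {
  bind_rows(lapply(combn(items, k, simplify = FALSE), cdr_corr, dat = dat))
}))
cdr_all$Corr_plot <- -cdr_all$Corr
```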

Best performing models: FTLD CDR SOB

Presents the best performing models by number of items included. The best models were defined by the most negative associations with the FTLD CDR SOB. Models were grouped by the number of items retained, and the top 7 in each group were selected.
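
This selection mirrors the earlier sketch, except the most negative correlations are kept; `cdr_all` is the stacked results object assumed above:

```r
library(dplyr)

# Within each number-of-items group, keep the 7 models with the most negative
# correlation with the FTLD CDR Sum of Boxes
best_cdr_models <- cdr_all %>%
  group_by(items.Retained) %>%
  slice_min(Corr, n = 7) %>%
  ungroup() %>%
  arrange(items.Retained, Corr)
```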

Top 7 Correlations with the FTLD CDR Sum of Boxes (ftldcdr_sob) per Number of Items

| Items Retained | Tests Retained | r | 95% CI (lower) | 95% CI (upper) | p |
|---|---|---|---|---|---|
| 2 | animals_r, veg_r | -0.670 | -0.718 | -0.616 | 0 |
| 2 | animals_r, fwords_r | -0.656 | -0.705 | -0.600 | 0 |
| 2 | fwords_r, veg_r | -0.655 | -0.705 | -0.600 | 0 |
| 2 | animals_r, lwords_r | -0.651 | -0.700 | -0.595 | 0 |
| 2 | lwords_r, veg_r | -0.645 | -0.696 | -0.588 | 0 |
| 2 | match_r, veg_r | -0.637 | -0.667 | -0.605 | 0 |
| 2 | animals_r, match_r | -0.633 | -0.664 | -0.600 | 0 |
| 3 | animals_r, fwords_r, veg_r | -0.676 | -0.723 | -0.624 | 0 |
| 3 | animals_r, lwords_r, veg_r | -0.673 | -0.720 | -0.620 | 0 |
| 3 | animals_r, dotcount_r, veg_r | -0.665 | -0.708 | -0.618 | 0 |
| 3 | animals_r, setshift_r, veg_r | -0.659 | -0.702 | -0.611 | 0 |
| 3 | fwords_r, lwords_r, veg_r | -0.653 | -0.702 | -0.597 | 0 |
| 3 | animals_r, fwords_r, lwords_r | -0.653 | -0.702 | -0.597 | 0 |
| 3 | animals_r, flanker_r, veg_r | -0.652 | -0.695 | -0.604 | 0 |
| 4 | animals_r, dotcount_r, setshift_r, veg_r | -0.680 | -0.721 | -0.635 | 0 |
| 4 | animals_r, dotcount_r, flanker_r, veg_r | -0.673 | -0.713 | -0.628 | 0 |
| 4 | animals_r, dotcount_r, fwords_r, veg_r | -0.668 | -0.710 | -0.622 | 0 |
| 4 | animals_r, fwords_r, lwords_r, veg_r | -0.667 | -0.715 | -0.613 | 0 |
| 4 | animals_r, dotcount_r, lwords_r, veg_r | -0.666 | -0.708 | -0.619 | 0 |
| 4 | dotcount_r, fwords_r, setshift_r, veg_r | -0.665 | -0.707 | -0.619 | 0 |
| 4 | dotcount_r, lwords_r, setshift_r, veg_r | -0.664 | -0.706 | -0.618 | 0 |
| 5 | animals_r, dotcount_r, fwords_r, setshift_r, veg_r | -0.681 | -0.721 | -0.636 | 0 |
| 5 | animals_r, dotcount_r, lwords_r, setshift_r, veg_r | -0.680 | -0.721 | -0.636 | 0 |
| 5 | animals_r, dotcount_r, flanker_r, lwords_r, veg_r | -0.675 | -0.715 | -0.631 | 0 |
| 5 | animals_r, dotcount_r, flanker_r, fwords_r, veg_r | -0.675 | -0.715 | -0.631 | 0 |
| 5 | animals_r, dotcount_r, flanker_r, setshift_r, veg_r | -0.670 | -0.711 | -0.626 | 0 |
| 5 | dotcount_r, fwords_r, lwords_r, setshift_r, veg_r | -0.666 | -0.708 | -0.619 | 0 |
| 5 | dotcount_r, flanker_r, fwords_r, lwords_r, veg_r | -0.663 | -0.704 | -0.618 | 0 |
| 6 | animals_r, dotcount_r, flanker_r, lwords_r, setshift_r, veg_r | -0.675 | -0.715 | -0.631 | 0 |
| 6 | animals_r, dotcount_r, flanker_r, fwords_r, setshift_r, veg_r | -0.674 | -0.714 | -0.630 | 0 |
| 6 | animals_r, dotcount_r, fwords_r, lwords_r, setshift_r, veg_r | -0.674 | -0.715 | -0.628 | 0 |
| 6 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, veg_r | -0.670 | -0.710 | -0.625 | 0 |
| 6 | dotcount_r, flanker_r, fwords_r, lwords_r, setshift_r, veg_r | -0.663 | -0.704 | -0.618 | 0 |
| 6 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, setshift_r | -0.659 | -0.700 | -0.613 | 0 |
| 6 | animals_r, flanker_r, fwords_r, lwords_r, setshift_r, veg_r | -0.651 | -0.693 | -0.604 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, setshift_r, veg_r | -0.670 | -0.711 | -0.626 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, lwords_r, rundots_r, setshift_r, veg_r | -0.642 | -0.683 | -0.597 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, fwords_r, rundots_r, setshift_r, veg_r | -0.641 | -0.683 | -0.596 | 0 |
| 7 | animals_r, dotcount_r, fwords_r, lwords_r, rundots_r, setshift_r, veg_r | -0.641 | -0.683 | -0.594 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, rundots_r, veg_r | -0.631 | -0.674 | -0.585 | 0 |
| 7 | dotcount_r, flanker_r, fwords_r, lwords_r, rundots_r, setshift_r, veg_r | -0.629 | -0.672 | -0.583 | 0 |
| 7 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, rundots_r, setshift_r | -0.626 | -0.669 | -0.579 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, rundots_r, setshift_r, veg_r | -0.640 | -0.681 | -0.594 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, match_r, setshift_r, veg_r | -0.610 | -0.642 | -0.576 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, lwords_r, match_r, rundots_r, setshift_r, veg_r | -0.601 | -0.633 | -0.567 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, fwords_r, lwords_r, match_r, rundots_r, veg_r | -0.600 | -0.632 | -0.566 | 0 |
| 8 | animals_r, dotcount_r, flanker_r, fwords_r, match_r, rundots_r, setshift_r, veg_r | -0.600 | -0.632 | -0.565 | 0 |
| 8 | animals_r, dotcount_r, fwords_r, lwords_r, match_r, rundots_r, setshift_r, veg_r | -0.597 | -0.630 | -0.563 | 0 |
| 8 | dotcount_r, flanker_r, fwords_r, lwords_r, match_r, rundots_r, setshift_r, veg_r | -0.595 | -0.628 | -0.560 | 0 |