Objective

Calculate the intercoder reliability index (Cohen's kappa, K) for the following coded variables (the standard formula is recalled after the list):

  1. Statement

  2. Concepts

  3. person

  4. organization

  5. country

  6. attitude toward nuclear
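For reference, Cohen's kappa corrects the observed agreement between the two coders for the agreement expected by chance. A LaTeX rendering of the standard definition (added here for clarity, not taken from the coding output) is:

```latex
\kappa = \frac{p_o - p_e}{1 - p_e}
```

where p_o is the observed proportion of agreement between the coders and p_e is the proportion of agreement expected by chance, computed from each coder's marginal distribution over the categories.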

Procedure

After coding the data in DNA, we exported the statements and compared the results between coders 1 and 2.

Coding was carried out by two coders working simultaneously on the whole dataset. The coders followed a codebook developed in advance and had regular meetings to compare results. In each round they first worked independently and afterwards met to discuss the results. The intercoder reliability indexes for each round are presented below; a sketch of how the per-item metrics can be computed is shown after this paragraph.
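The following is a minimal sketch of how these per-item metrics can be computed in R; it is an illustration under assumptions, not the actual script used. It assumes the statements exported from DNA by both coders have been aligned statement-by-statement and merged into a single data frame, here hypothetically called `coded`, with two columns per variable (e.g. `person_c1` / `person_c2`). The `irr` package then yields the percentage agreement and the unweighted Cohen's kappa for each item:

```r
# Minimal sketch (hypothetical data layout, not the authors' actual script).
# `coded` is assumed to hold the aligned exports of both coders, with two
# columns per coded variable: <item>_c1 for coder 1 and <item>_c2 for coder 2.
library(irr)

items <- c("person", "organization", "concept", "agreement",
           "category", "ideology", "nuclear", "country", "issue")

reliability <- data.frame(item = items,
                          agreement = NA_real_,
                          kappa = NA_real_)

for (i in seq_along(items)) {
  # Two-column data frame: coder 1's and coder 2's codes for this variable.
  ratings <- coded[, paste0(items[i], c("_c1", "_c2"))]
  reliability$agreement[i] <- agree(ratings)$value   # percentage agreement
  reliability$kappa[i]     <- kappa2(ratings)$value  # unweighted Cohen's kappa
}

reliability
```

Computed this way, each row of `reliability` corresponds to one row of the per-round tables reported below.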

Round 1 by both coders

At this stage, both coders worked independently on the years 2010-2013.

Initial inter-coder reliability results by item

| item | agreement (%) | Cohen’s kappa |
|------|---------------|---------------|
| person | 100.00000 | 1.0000000 |
| organization | 100.00000 | 1.0000000 |
| concept | 69.35484 | 0.6598571 |
| agreement | 99.20000 | 0.9719164 |
| category | 100.00000 | 1.0000000 |
| ideology | 100.00000 | 1.0000000 |
| nuclear | 100.00000 | 1.0000000 |
| country | 100.00000 | 1.0000000 |
| issue | 100.00000 | 1.0000000 |

Average to be reported

In sum, the Cohen's kappa value to report is the following:

## [1] "Average of Cohen's Kappa for the sample of the whole dataset: 0.959086"

2nd Round, after revision by both coders

In this round, the metrics are calculated after a meeting between the coders to discuss and review the results for the years 2010-2013.

Inter-coder reliability results by item

| item | agreement (%) | Cohen’s kappa |
|------|---------------|---------------|
| person | 100.00000 | 1.0000000 |
| organization | 100.00000 | 1.0000000 |
| concept | 92.62295 | 0.9172632 |
| agreement | 99.18699 | 0.9695318 |
| category | 100.00000 | 1.0000000 |
| ideology | 100.00000 | 1.0000000 |
| nuclear | 100.00000 | 1.0000000 |
| country | 100.00000 | 1.0000000 |
| issue | 100.00000 | 1.0000000 |

Average to be reported

In sum, the Cohen's kappa value to report is the following:

## [1] "Average of Cohen's Kappa for the sample of the whole dataset: 0.987422"

3rd Round, after revision by both coders

To the previously discussed results, we added new results for the years 2013-2014, which were coded independently.

Inter-coder reliability results by item

| item | agreement (%) | Cohen’s kappa |
|------|---------------|---------------|
| person | 97.87686 | 0.9784833 |
| organization | 96.38298 | 0.9627959 |
| concept | 95.76271 | 0.9520712 |
| agreement | 98.73684 | 0.9672602 |
| category | 96.17021 | 0.9517743 |
| ideology | 97.01493 | 0.9466937 |
| nuclear | 99.57895 | 0.9908391 |
| country | 99.78769 | 0.9400992 |
| issue | 99.57537 | 0.9935527 |

Average to be reported

In sum, the Cohen's kappa value to report is the following:

## [1] "Average of Cohen's Kappa for the sample of the whole dataset: 0.964841"

4th Round, after revision by both coders

In this round, all results from 2010 to 2015 were added and reviewed by both coders. A meeting to discuss differences helped to increase agreement.

Inter-coder reliability results by item

| item | agreement (%) | Cohen’s kappa |
|------|---------------|---------------|
| person | 98.29457 | 0.9827569 |
| organization | 97.20497 | 0.9712951 |
| concept | 95.82043 | 0.9528691 |
| agreement | 99.07550 | 0.9746844 |
| category | 97.04969 | 0.9628177 |
| ideology | 97.66355 | 0.9583398 |
| nuclear | 99.69183 | 0.9929948 |
| country | 99.84496 | 0.9465839 |
| issue | 99.53488 | 0.9926420 |

Average to be reported

In sum, the Cohen's kappa value to report is the following:

## [1] "Average of Cohen's Kappa for the sample of the whole dataset: 0.970554"

5th Round, after revision by both coders

In this round, all results from 2010 to 2018 were added and reviewed by both coders. A meeting to discuss differences helped to increase agreement.

Inter-coder reliability results by item

| item | agreement (%) | Cohen’s kappa |
|------|---------------|---------------|
| person | 98.36735 | 0.9835524 |
| organization | 97.71429 | 0.9767328 |
| concept | 96.98206 | 0.9653401 |
| agreement | 99.07550 | 0.9746844 |
| category | 97.04969 | 0.9628177 |
| ideology | 97.66355 | 0.9583398 |
| nuclear | 99.69183 | 0.9929948 |
| country | 99.84496 | 0.9465839 |
| issue | 99.53488 | 0.9926420 |

Average to be reported

In sum, the Cohen's kappa value to report is the following:

## [1] "Average of Cohen's Kappa for the sample of the whole dataset: 0.972632"

6th Round, after revision by both coders

In this round we added the year 2019, so the calculations cover all results from 2010 to 2019, which were added and reviewed by both coders. A meeting to discuss differences helped to increase agreement.

Inter-coder reliability results by item

| item | agreement (%) | Cohen’s kappa |
|------|---------------|---------------|
| person | 97.65594 | 0.9764133 |
| organization | 96.98053 | 0.9694571 |
| concept | 91.77919 | 0.9074361 |
| agreement | 98.29281 | 0.9641304 |
| category | 97.52066 | 0.9695867 |
| ideology | 98.70734 | 0.9782983 |
| nuclear | 99.89653 | 0.9947238 |
| country | 99.48374 | 0.6855213 |
| issue | 98.81198 | 0.9802836 |

Average to be reported

In sum, the Cohen's kappa value to report is the following:

## [1] "Average of Cohen's Kappa for the sample of the whole dataset: 0.936206"

7th Round, after revision by both coders

The year 2020 was added, so the new calculations cover all results from 2010 to 2020, which were added and reviewed by both coders. A meeting to discuss differences helped to increase agreement.

Inter-coder reliability results by item

| item | agreement (%) | Cohen’s kappa |
|------|---------------|---------------|
| person | 97.81659 | 0.9780450 |
| organization | 97.31273 | 0.9728379 |
| concept | 91.87101 | 0.9077641 |
| agreement | 98.32846 | 0.9628299 |
| category | 97.82972 | 0.9732499 |
| ideology | 98.87218 | 0.9805292 |
| nuclear | 99.91642 | 0.9948465 |
| country | 99.58281 | 0.7205707 |
| issue | 98.99833 | 0.9815723 |

Average to be reported

In sum, the Cohen's kappa value to report is the following:

## [1] "Average of Cohen's Kappa for the sample of the whole dataset: 0.941361"

8th Round, after revision by both coders

The year 2021 was added, so the new calculations cover all results from 2010 to 2021, which were added and reviewed by both coders. A meeting to discuss differences helped to increase agreement.

Inter-coder reliability results by item

| item | agreement (%) | Cohen’s kappa |
|------|---------------|---------------|
| person | 98.28776 | 0.9828093 |
| organization | 97.95553 | 0.9793372 |
| concept | 92.53579 | 0.9165205 |
| agreement | 98.55813 | 0.9630348 |
| category | 98.43938 | 0.9808510 |
| ideology | 99.18919 | 0.9857483 |
| nuclear | 99.93992 | 0.9949842 |
| country | 99.69997 | 0.7606738 |
| issue | 99.24970 | 0.9853007 |

Average to be reported

In sum, the Cohen's kappa value to report is the following:

## [1] "Average of Cohen's Kappa for the sample of the whole dataset: 0.949918"

9th Round, after revision by both coders

The year 2022 was added. A review by both coders has not yet been conducted.

Inter-coder reliability results by item

| item | agreement (%) | Cohen’s kappa |
|------|---------------|---------------|
| person | 98.76429 | 0.9876049 |
| organization | 98.50609 | 0.9848986 |
| concept | 93.36659 | 0.9265361 |
| agreement | 98.82304 | 0.9632173 |
| category | 98.92562 | 0.9867648 |
| ideology | 99.44192 | 0.9900450 |
| nuclear | 99.62833 | 0.9573322 |
| country | 99.79343 | 0.8740615 |
| issue | 99.48347 | 0.9894443 |

Average to be reported

In sum, the Cohen's kappa value to report is the following:

## [1] "Average of Cohen's Kappa for the sample of the whole dataset: 0.962212"