Religious Hostilities Reach Six-Year High
Appendix 1: Methodology
This is the fifth time the Pew Research Center has measured restrictions on religion around the globe.1 This report, which includes data for the year ending Dec. 31, 2012, follows the same methodology as previous reports.
Pew Research uses two 10-point indexes – the Government Restrictions Index (GRI) and the Social Hostilities Index (SHI) – to rate 198 countries and self-governing territories on their levels of restrictions.2 This report analyzes changes in restrictions on an annual basis, looking at six years ending mid-2007, mid-2008, mid-2009, mid-2010, Dec. 31, 2011, and Dec. 31, 2012. It categorizes the amount of change in each country’s scores in two ways, numerically and by percentile.
First, countries are grouped into categories depending on the size of the numeric change in their scores from year to year on the two indexes: changes of two points or more in either direction; changes of at least one point but less than two points; changes of less than one point; or no change at all. (See chart at right and charts in Changes in Social Hostilities, Changes in Government Restrictions and Changes in Overall Restrictions sections of the report.)
Changes in overall levels of restrictions are calculated for each country by comparing its scores on both indexes (the GRI and the SHI) from year to year. When a country’s scores on the GRI and the SHI changed in the same direction (both increased or both decreased), the greater amount of change determined the category. For instance, if the country’s GRI score increased by 0.8 and its SHI score increased by 1.5, the country was put into the overall “1.0-1.9 increase” category. When a country’s score increased on one index but decreased on the other, the difference between the amounts of change determined the grouping. For example, if the country’s GRI score increased by 2.0 and its SHI score decreased by 1.5, the country went into the overall “0.1-0.9 increase” category. When a country’s score on one index stayed the same, the amount of change on the other index was used to assign the category.
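The decision rules above can be summarized in a short sketch. This is an illustrative reconstruction of the categorization logic as described in the text, not Pew Research's actual code; the function name and signature are assumptions.

```python
def overall_change_category(gri_delta, shi_delta):
    """Illustrative sketch of the rules for combining year-over-year
    GRI and SHI changes into an overall change category."""
    if gri_delta == 0:
        # One index unchanged: the other index determines the category.
        net = shi_delta
    elif shi_delta == 0:
        net = gri_delta
    elif (gri_delta > 0) == (shi_delta > 0):
        # Same direction: the larger absolute change determines the category.
        net = gri_delta if abs(gri_delta) >= abs(shi_delta) else shi_delta
    else:
        # Opposite directions: the difference between the changes decides.
        net = gri_delta + shi_delta

    mag = abs(net)
    if mag >= 2.0:
        band = "2.0 or more"
    elif mag >= 1.0:
        band = "1.0-1.9"
    elif mag > 0:
        band = "0.1-0.9"
    else:
        return "no change"
    return f"{band} {'increase' if net > 0 else 'decrease'}"
```

Applied to the examples in the text, a GRI change of +0.8 with an SHI change of +1.5 yields "1.0-1.9 increase," and a GRI change of +2.0 with an SHI change of -1.5 yields "0.1-0.9 increase."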
Second, this report categorizes the levels of government restrictions and social hostilities in each country by percentiles. As the benchmark, it uses the results from the baseline year (the year ending in mid-2007). Scores in the top 5% on each index in mid-2007 were categorized as “very high.” The next highest 15% of scores were categorized as “high,” and the following 20% were categorized as “moderate.” The bottom 60% of scores were categorized as “low.” See the table to the right for the index score thresholds as determined from the mid-2007 data. These thresholds are applied to all subsequent years of data.
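The percentile scheme can be sketched as follows. The actual thresholds come from Pew Research's mid-2007 baseline data; the scores used here are hypothetical, and the function names are illustrative only.

```python
def baseline_cutoffs(scores):
    """Return (very_high_min, high_min, moderate_min) thresholds derived
    from a baseline-year distribution of index scores."""
    s = sorted(scores)
    n = len(s)

    def pct(p):
        # Smallest score at or above the given percentile.
        return s[min(n - 1, round(n * p))]

    # Top 5% -> "very high", next 15% -> "high", next 20% -> "moderate",
    # bottom 60% -> "low".
    return pct(0.95), pct(0.80), pct(0.60)

def categorize(score, cutoffs):
    """Assign a score to a category using baseline-year thresholds,
    as the study does for all subsequent years of data."""
    very_high_min, high_min, moderate_min = cutoffs
    if score >= very_high_min:
        return "very high"
    if score >= high_min:
        return "high"
    if score >= moderate_min:
        return "moderate"
    return "low"
```

The key design point, reflected in `categorize`, is that thresholds are computed once from the baseline year and then held fixed, so later years are measured against the mid-2007 distribution rather than re-ranked each year.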
Overview of Procedures
The methodology used by Pew Research to assess and compare restrictions on religion was developed by senior researcher and director of cross-national data Brian J. Grim in consultation with other members of the Pew Research Center staff, building on a methodology that Grim and Professor Roger Finke developed while at Penn State University’s Association of Religion Data Archives.3 The goal was to devise quantifiable, objective and transparent measures of the extent to which governments and societal groups impinge on the practice of religion. The findings were used to rate countries and self-governing territories on two indexes that are reproducible and can be periodically updated.
This research goes beyond previous efforts to assess restrictions on religion in several ways. First, Pew Research coded (categorized and counted) data from 18 published cross-national sources, providing a high degree of confidence in the findings. The Pew Research coders looked to the sources for only specific, well-documented facts, not opinions or commentary.
Second, Pew Research staff used extensive data-verification checks that reflect generally accepted best practices for such studies, such as double-blind coding (coders do not see each other’s ratings), inter-rater reliability assessments (checking for consistency among coders) and carefully monitored protocols to reconcile discrepancies among coders.
Third, the Pew Research coding took into account whether the perpetrators of religion-related violence were government or private actors. The coding also identified how widespread and intensive the restrictions were in each country.
Fourth, one of the most valuable contributions of the indexes and the questions used to construct them (see the section on “The Coding Instrument”) is their ability to chart change over time.
Countries and Territories
The 198 countries and self-administering territories covered by the study contain more than 99.5% of the world’s population. They include 192 of the 193 member states of the United Nations as of 2012 plus six self-administering territories – Kosovo, Hong Kong, Macau, the Palestinian territories, Taiwan and Western Sahara.4 Reporting on these territories does not imply any position on what their international political status should be, only recognition that the de facto situations in these territories require separate analysis.
Although the 198 countries and territories vary widely in size, population, wealth, ethnic diversity, religious makeup and form of government, the study does not attempt to adjust for such differences. Poor countries are not scored differently on the indexes than wealthy ones. Countries with diverse ethnic and religious populations are not “expected” to have more social hostilities than countries with more homogeneous populations. And democracies are not assessed more leniently or harshly than authoritarian regimes.
Pew Research identified 18 widely available, frequently cited sources of information on government restrictions and social hostilities involving religion around the world. This study includes three sources that were not used in the baseline report on religious restrictions. (See the “Primary Sources” section for more details on the new information sources.)
The primary sources, which are listed below, include reports from U.S. government agencies, several independent, nongovernmental organizations and a variety of European and United Nations bodies. Although most of these organizations are based in Western countries, many of them depend on local staff to collect information across the globe. As previously noted, Pew Research did not use the commentaries, opinions or normative judgments of the sources; the sources were combed only for factual information on specific policies and actions.
1. Country constitutions
2. U.S. State Department annual reports on International Religious Freedom
3. U.S. Commission on International Religious Freedom annual reports
4. U.N. Special Rapporteur on Freedom of Religion or Belief reports
5. Human Rights First reports in first and second years of coding; Freedom House reports in third, fourth and fifth years of coding
6. Hudson Institute publication: “Religious Freedom in the World” (Paul Marshall)
7. Human Rights Watch topical reports
8. International Crisis Group country reports
9. United Kingdom Foreign & Commonwealth Office annual report on human rights
10. Council of the European Union annual report on human rights
11. Amnesty International reports
12. European Network Against Racism Shadow Reports
13. United Nations High Commissioner for Refugees reports
14. U.S. State Department annual Country Reports on Terrorism
15. Anti-Defamation League reports
16. U.S. State Department Country Reports on Human Rights Practices
17. Uppsala University’s Uppsala Conflict Data Program, Armed Conflict Database
18. Human Rights Without Frontiers “Freedom of Religion or Belief” newsletters
U.S. government reports with information on the situation in the United States
- U.S. Department of Justice “Religious Freedom in Focus” newsletters and reports
- FBI Hate Crime Reports
As noted, this study includes three sources that were not included in the Pew Research Center’s first report on global restrictions on religion: Freedom House reports; Uppsala University’s Armed Conflict Database; and the “Freedom of Religion or Belief” newsletters of Human Rights Without Frontiers.
The Freedom House reports have replaced Human Rights First reports, which have not been updated since mid-2008. The Uppsala Armed Conflict Database provides information on the number of people affected by religion-related armed conflicts, supplementing other sources. The Human Rights Without Frontiers “Freedom of Religion or Belief” newsletters have partially replaced the Hudson Institute publication, “Religious Freedom in the World” (by Paul Marshall), which has not been updated since its release in 2008. Human Rights Without Frontiers is a nongovernmental organization based in Brussels, with affiliated offices throughout the world. The Hudson Institute publication still offers useful background on certain standing laws but no longer provides information on new or changing restrictions.
In previous years, 19 sources were used in the coding. In the most recent year of the study, however, the U.S. government’s Worldwide Incident Tracking System (WITS) became unavailable online. Despite the loss of this source, three of the remaining 18 sources provide comprehensive coverage of religion-related terrorism: the International Crisis Group’s country reports, Uppsala University’s Uppsala Conflict Data Program Armed Conflict Database and the State Department’s annual Country Reports on Terrorism.
While some of the increases in religious restrictions noted in this study could reflect the use of more up-to-date and/or better information sources, Pew Research staff monitor the impact of source variability each year and have found no evidence of overall informational bias. (For additional discussion, see the “New Checks on Potential Biases” section.)
The Coding Instrument
As explained in more detail below, Pew Research staff developed a battery of questions similar to a survey questionnaire. Coders consulted the primary sources in order to answer the questions separately for each country. While the State Department’s annual reports on International Religious Freedom generally contained the most comprehensive information, the other sources provided additional factual detail that was used to settle ambiguities, resolve contradictions and help in the proper scoring of each question.
The questionnaire, or coding instrument, generated a set of numerical measures on restrictions in each country. It also made it possible to see how government restrictions intersect with broader social tensions and incidents of violence or intimidation by private actors. The coding instrument with the list of questions used for this report is shown in the Summary of Results (PDF).
The coding process required the coders to check all the sources for each country. Coders determined whether each source provided information critical to assigning a score; had supporting information but did not result in new facts; or had no available information on that particular country. Multiple sources of information were available for all countries and self-administering territories with populations greater than 1 million. More than three-in-four of the countries and territories analyzed by the Pew Research Center were multi-sourced; only small, predominantly island countries had a single source, namely the State Department reports.
Coding the United States presented a special problem since it is not included in the State Department’s annual reports on International Religious Freedom. Accordingly, Pew Research coders also looked at reports from the U.S. Department of Justice and the FBI on violations of religious freedom in the United States, in addition to consulting all the primary sources, including reports by the United Nations, Human Rights Watch, Freedom House, the International Crisis Group and the U.K. Foreign & Commonwealth Office, many of which contain data on the United States.
The Coding Process
The Pew Research Center employed strict training and rigorous coding protocols to make its coding as objective and reproducible as possible. Coders worked directly under a senior researcher’s supervision, with additional direction and support provided by other Pew Research Center researchers. The coders underwent an intensive training period that included a thorough overview of the research objectives, information sources and methodology.
Countries were double-blind coded by two coders (coders did not see each other’s ratings), and the initial ratings were entered into an electronic document (the coding instrument) that included details on each incident. The coders began by filling out the coding instrument for each country using the information source that had the most comprehensive information, typically the State Department reports. The protocol for each coder was to answer every question on which information was available in the initial source. Once a coder had completed that process, he or she then turned to the other sources. As new information was found, it was coded and the source duly noted. Whenever ambiguities or contradictions arose, the source providing the most detailed, clearly documented evidence was used.
After two coders had separately completed the coding instrument for a particular country, their scores were compared by a senior researcher. Areas of discrepancy were discussed at length with the coders and were reconciled in order to arrive at a single score on each question for each country. The data for each country were then combined into a master file, and the answers and substantiating evidence were entered into a database.
Throughout this process, the coding instrument itself was continually monitored for possible defects. The questions were designed to be precise, comprehensive and objective so that, based on the same data and definitions, the coding could be reliably reproduced by others with the same results.
Pew Research staff generally found few cases in which one source contradicted another. When contradictions did arise – such as when sources provided differing estimates of the number of people displaced due to religion-related violence – the source that cited the most specific documentation was used. The coders were instructed to disregard broad, unsubstantiated generalizations regarding abuses and to focus on reports that contained clear, precise documentation and factual details, such as names, dates and places where incidents occurred.
The data-verification procedures went beyond the inter-rater reliability statistics. They also involved comparing the answers on the main measures for each country with other closely related questions in the data set. This provided a practical way to test the internal reliability of the data.
Pew Research staff also checked the reliability of the coded data by comparing them with similar, though more limited, religious restrictions data sets. In particular, published government and social regulation of religion index scores are available from the Association of Religion Data Archives (for three years of data) and the Hudson Institute (for one year of data), which makes them ideal measures for cross-validation. The review process found very few significant discrepancies in the coded data; changes were made only if warranted by a further review of the primary sources.
Restriction of Religion Indexes
The Government Restrictions Index is based on 20 indicators of ways that national and local governments restrict religion, including through coercion and force. The Social Hostilities Index is based on 13 indicators of ways in which private individuals and social groups infringe on religious beliefs and practices, including religiously biased crimes, mob violence and efforts to stop particular religious groups from growing or operating. The study also counted the number and types of documented incidents of religion-related violence, including terrorism and armed conflict.
Government Restrictions Index
Coding multiple indicators makes it possible to construct a Government Restrictions Index of sufficient gradation to allow for meaningful cross-national comparisons. An additional advantage of using multiple indicators is that it helps mitigate the effects of measurement error in any one variable, providing greater confidence in the overall measure.
The Pew Research Center coded 20 indicators of government restrictions on religion (see the Summary of Results [PDF]). These 20 items were added together to create the GRI. In two cases, these items represent an aggregation of several closely related questions: Measures of five types of physical abuses are combined into a single variable (GRI Q.19), and seven questions measuring aspects of government favoritism are combined into an overall favoritism scale (GRI Q.20 is a summary variable showing whether a country received the maximum score on one or more of the seven questions).
The GRI is a fine-grained measure on a 0-to-10 metric, with zero indicating very low levels of government restrictions on religion and 10 indicating extremely high levels. The 20 questions that form the GRI are coded on a standard scale from zero to one point, with gradations among the answers allowing partial points to be given for lesser degrees of the particular government restriction being measured. The overall value of the index was then proportionally adjusted to a possible range of zero to 10 by dividing the sum of the 20 variables by two.
Social Hostilities Index
In addition to government restrictions, violence and intimidation in societies also can limit religious beliefs and practices. Accordingly, Pew Research staff tracked more than a dozen indicators of social impediments to religion. Once again, coding multiple indicators made it possible to construct an index that shows gradations of severity or intensity and allows for comparisons among countries. The Summary of Results (PDF) contains the 13 items used by Pew Research staff to create the Social Hostilities Index.
The SHI was constructed by adding together the 13 indicators on a 0-to-10 metric, with zero indicating very low impediments to religious beliefs and practices and 10 indicating extremely high impediments. The various questions that form the index are coded on a standard scale from zero to one point, with gradations among the answers allowing partial points to be given for lesser degrees of the particular hostilities being measured. The indicators were added together and set to a possible range of zero to 10 by dividing the sum of the variables by 1.3.
Note on How Examples Are Coded
Examples of each type of government restriction or social hostility are generally counted in a single measure on the GRI or SHI. For instance, a restriction on proselytizing (sharing one’s faith with the intent of persuading another to join the faith) is not also counted as a restriction on conversion (an individual changing his/her religion). In some situations, however, an individual restriction or hostility may be part of a broader set of restrictions or hostilities. For instance, a mob attack by members of one religious group on an individual of another religion may be an isolated event and counted just under question SHI.Q.2: Was there mob violence related to religion? (See the Summary of Results [PDF].) However, if such an attack triggers repeated attacks between religious groups, it also might be an indication of sectarian or communal violence, which by definition involves two or more religious groups facing off in repeated clashes. In such a case, the mob attack also would be counted under question SHI.Q.3: Were there acts of sectarian or communal violence between religious groups? (See the Summary of Results [PDF].)
Note on the Effects of Consolidating to a New Database
For all six years of this study, information on the number, types and locations of incidents of government force and social violence toward religious groups, as well as on deference to religious authorities in matters of law, was coded at the province level. (See the “Example of Data Coding: India” (PDF) section of the December 2009 baseline report.) Each year, the province numbers were summed and put into separate country-level files. Since the publication of the August 2011 report, Pew Research staff have created a database that integrates all six years of province- and country-level data on religious restrictions. During this process, Pew Research staff reviewed any discrepancies between the province files and the sums that had been transferred to the country files and made appropriate corrections. The adjustments were relatively minor and had small effects on countries’ index scores, on average less than 0.005 points on the 10-point indexes. Consolidating the six years of data into a database also entailed a review of the data on harassment of religious groups. In particular, instances of harassment from the year ending in mid-2007 were stored as open-ended questions, and in a few cases they were recoded to match the categories used in subsequent years.
Note on Changing Time Period of Analysis
This new report looks at the extent and direction of change in government restrictions on religion and social hostilities involving religion during calendar year 2012. Where appropriate, it also compares the situation in 2012 with the situation in the baseline year of the study (mid-2006 to mid-2007).
This is the second time Pew Research has analyzed restrictions on religion in a calendar year. Previous reports analyzed 12-month periods from July 1-June 30 (e.g., July 1, 2009-June 30, 2010). The shift to calendar years was made, in part, because most of the primary sources used in this study are based on calendar years.
Because of the shift in time frame, this study does not report directly on incidents that occurred during the period from July 1-Dec. 31, 2010. While this misses some incidents that occurred during the second half of 2010, events that had an ongoing impact – such as a change to a country’s constitution or the outbreak of a religion-related war – were captured by the coding. Researchers for the study carefully reviewed the situation in each country and territory during this six-month period and made sure that restrictions with an ongoing impact were not overlooked.
Additional Analyses in the Study
As in the 2012 and 2013 reports, this study provides a summary of the number of countries where specific religious groups faced government or social harassment. This is essentially a cross-tabulation of GRI Q.11 (“Was there harassment or intimidation of religious groups by any level of government?”) and the first type of religious hatred or bias measured in SHI.Q.1.a. (“Did individuals face harassment or intimidation motivated by religious hatred or bias?”). For purposes of this study, the definition of harassment includes any mention in the primary sources of an offense against an individual or group based on religious identity. Such offenses may range from physical attacks and direct coercion to more subtle forms of discrimination. But prejudicial opinions or attitudes, in and of themselves, do not constitute harassment unless they are acted upon in a palpable way.
As noted above, this study provides data on the number of countries in which different religious groups are harassed or intimidated. But the study does not assess either the severity or the frequency of the harassment in each country. Therefore, the results should not be interpreted as gauging which religious group faces the most harassment or persecution around the world.
Religion-Related Terrorism and Armed Conflict
Terrorism and war can have huge direct and indirect effects on religious groups, including destroying religious sites, displacing whole communities and inflaming sectarian passions. Accordingly, Pew Research tallied the number, location and consequences of religion-related terrorism and armed conflict around the world, as reported in the same primary sources used to document other forms of intimidation and violence. However, war and terrorism are sufficiently complex that it is not always possible to determine the degree to which they are religiously motivated or state sponsored. Out of an abundance of caution, this study does not include them in the Government Restrictions Index. They are factored instead into the index of social hostilities involving religion, which includes one question specifically about religion-related terrorism and one question specifically about religion-related war or armed conflict. In addition, other measures in both indexes are likely to pick up spillover effects of war and terrorism on the level of religious tensions in society. For example, hate crimes, mob violence and sectarian fighting that occur in the aftermath of a terrorist attack or in the context of a religion-related war would be counted in the Social Hostilities Index, and laws or policies that clearly discriminate against a particular religious group would be registered on the Government Restrictions Index.
For the purposes of this study, the term “religion-related terrorism” is defined as premeditated, politically motivated violence perpetrated against noncombatant targets by subnational groups or clandestine agents that have some identifiable religious ideology or religious motivation. It also includes acts carried out by groups that have a nonreligious identity but target religious personnel, such as clergy. Readers should note that it is the political character and motivation of the groups, not the type of violence, that is at issue here. For instance, a bombing would not be classified as religion-related terrorism if there was no clearly discernible religious ideology or bias behind it unless it was directed at religious personnel. Religion-related war or armed conflict is defined as armed conflict (a conflict that involves sustained casualties over time or more than 1,000 battle deaths) in which religious rhetoric is commonly used to justify the use of force, or in which one or more of the combatants primarily identifies itself or the opposing side by religion.
As noted earlier, the primary sources indicate that the North Korean government is among the most repressive in the world, including toward religion. But because independent observers lack regular access to North Korea, the sources are unable to provide the kind of specific, timely information that forms the basis of this report. Therefore, North Korea is not included on either index.
This raises two important issues concerning potential information bias in the sources. The first is whether other countries that limit outsiders’ access and that may seek to obscure or distort their record on religious restrictions were adequately covered by the sources. Countries with relatively limited access have multiple primary sources of information that the Pew Research Center used for its coding. Each is also covered by other secondary quantitative data sets on religious restrictions that have used a similar coding scheme, including earlier years of coded State Department report data produced by Grim at Penn State’s Association of Religion Data Archives (ARDA) project (four data sets); independent coding by experts at the Hudson Institute’s Center for Religious Liberty using indexes also available from ARDA (one data set); and content analysis of country constitutions conducted by the Becket Fund for Religious Liberty (one data set). Pew Research staff used these for cross-validation. Thus, contrary to what one might expect, even most countries that limit access to information tend to receive fairly extensive coverage by groups that monitor religious restrictions.
The second key question – the flipside of the first – is whether countries that provide freer access to information receive worse scores simply because more information is available on them. As described more fully in the methodology in the baseline report, Pew Research staff compared the length of State Department reports on freer-access countries with those of less-free-access countries. The comparison found that the median number of words was approximately three times as large for the limited-access countries as for the open-access countries. This suggests that problems in freer-access countries are generally not overreported in the State Department reports.
Only when it comes to religion-related violence and intimidation in society do the sources report more problems in the freer-access countries than in the limited-access ones. However, the Social Hostilities Index includes several measures – such as SHI.Q.8 (“Did religious groups themselves attempt to prevent other religious groups from being able to operate?”) and SHI.Q.11 (“Were women harassed for violating religious dress codes?”) – that are less susceptible to such reporting bias because they capture general social trends or attitudes as well as specific incidents. With these limitations in mind, it appears that the coded information on social hostilities is a fair gauge of the situation in the vast majority of countries and a valuable complement to the information on government restrictions.
Data on social impediments to religious practice can more confidently be used to make comparisons among countries with sufficient openness, which includes more than nine-in-ten countries covered in the coding. An analysis by Grim and Richard Wike, the Pew Research Center’s director of global attitudes research, tested the reliability of the State Department reports on social impediments to religious practice by comparing public opinion data with data coded from the reports in previous years by Grim and experts at Penn State. They concluded that “the understanding of social religious intolerance embodied in the State Department reports is comparable with the results of population surveys and individual expert opinion.”5
New Checks on Potential Biases
Information contained in the sources used in this study varies from year to year. Sometimes this variation is systematic. For instance, as described on page 39, some organizations do not continue to update their reports.
It is also possible that the information sources used in this study are getting better or worse at reporting government restrictions on religion and social hostilities involving religion, potentially biasing index scores over time. Pew Research staff tracked this potential problem in two ways. First, they assessed the amount of detail contained in the sources, and second, they tracked the global coverage of the sources.
Assessing the Amount of Detail Contained in the Sources
The amount of detail in reports – as judged by overall word count – varies from year to year in some reports. For instance, the amount of coverage in an International Crisis Group report can change depending on the severity of the conflict or crisis in a given country.
Pew Research staff have been particularly concerned, however, with the possibility of underreporting. Specifically, the length of the U.S. State Department’s annual reports on international religious freedom – the most comprehensive source used in this study – has been substantially reduced. As shown in the table on page 48, word counts for the State Department’s International Religious Freedom (IRF) reports decreased substantially between this study’s baseline year (July 1, 2006, to June 30, 2007) and the most recent year of this study (calendar year 2012).6
The IRF report’s government sections were 24% shorter in the most recent year (254,016 words) than in the baseline year (332,517 words). The length of the government sections has also decreased since 2007 in every region, although it increased in a few regions between 2011 and 2012.
The IRF report’s social sections were 16% shorter for the most recent year (60,081 words) than in the baseline year (71,682 words). Between 2011 and 2012, there was a slight decrease in the total amount of information available within the social sections, although the word count increased for three of the five regions (the Americas, Middle East-North Africa and sub-Saharan Africa).
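The percentage declines above follow directly from the word counts quoted in this appendix; a minimal arithmetic check (figures taken from the text, rounding to the nearest whole percent):

```python
# Word counts quoted in this appendix for the State Department IRF reports.
baseline = {"government": 332_517, "social": 71_682}  # July 1, 2006 - June 30, 2007
latest = {"government": 254_016, "social": 60_081}    # calendar year 2012

def pct_decline(old: int, new: int) -> int:
    """Percentage decrease from old to new, rounded to a whole percent."""
    return round(100 * (old - new) / old)

gov_decline = pct_decline(baseline["government"], latest["government"])  # 24
soc_decline = pct_decline(baseline["social"], latest["social"])          # 16
```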
The streamlined IRF reports tend to summarize incidents and trends rather than providing detailed lists of government restrictions and social hostilities, as they did in earlier reports. This introduces potential bias in the coding because coders record only concrete reports about specific government laws, policies and actions, as well as specific incidents of religious violence or intolerance by social groups; they do not rely on the commentaries or opinions of the sources.
Tracking the Global Coverage of the Sources
Beginning in the year ending in mid-2010, Pew Research staff have tracked the number of countries for which each source provided information on government restrictions on religion or social hostilities involving religion, as shown in the table on page 50. For instance, Human Rights Without Frontiers newsletters (source 19) provided pertinent information for fewer countries in 2012 (67) than in 2011 (70).
One possible sign of systematic bias would be a decline across all the sources in the number of countries for which restrictions or hostilities were reported. But this was not the case: seven of the sources provided information for a larger number of countries in the most recent year of the study than in the previous year, while 10 provided less coverage.
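The bias check described above amounts to comparing each source's country coverage across consecutive years and asking whether every source declined. A hypothetical sketch of that check (the per-source counts below are illustrative, except for the Human Rights Without Frontiers figures quoted in this appendix):

```python
# Countries covered per source; only the HRWF figures (70 -> 67) come from
# the appendix text. The other sources and counts are invented for illustration.
coverage_2011 = {"HRWF": 70, "source_a": 55, "source_b": 40}
coverage_2012 = {"HRWF": 67, "source_a": 58, "source_b": 38}

gained = [s for s in coverage_2011 if coverage_2012[s] > coverage_2011[s]]
declined = [s for s in coverage_2011 if coverage_2012[s] < coverage_2011[s]]

# Systematic underreporting would show as every source declining at once;
# a mix of gains and declines, as here, does not fit that pattern.
all_declined = len(declined) == len(coverage_2011)
```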
Mexico Case Study
As part of a previous study, Pew Research staff also examined whether the primary sources portrayed an inaccurate picture of religious restrictions and hostilities in a country. To assess this – albeit in a limited fashion – Pew Research staff compared the results of coding government restrictions on religion and social hostilities involving religion in Mexico using (a) the 19 primary sources for that study with (b) content analysis of Spanish language news reports about religious restrictions and hostilities.
To make this comparison, Spanish-speaking Pew Research staff analyzed the content of articles with reports of government restrictions on religion and social hostilities involving religion from the Mexican daily newspaper La Jornada.7 The analysis covered the same two time periods as the Pew Research study: the baseline year (July 1, 2006, to June 30, 2007) and the fourth year of this study (July 1, 2009, to June 30, 2010).
La Jornada articles were selected for analysis if a headline made some reference to religion, in which case the article was coded using the same Government Restrictions Index (GRI) and Social Hostilities Index (SHI) questions used in this study. Specifically, the content analysis of La Jornada articles examined 18 of the 20 questions of the Government Restrictions Index and all 13 questions of the Social Hostilities Index. The two GRI questions excluded from the analysis were GRI.1 and GRI.2 because both related only to the constitution rather than to actions of the government or members and groups in society.
For instance, for the year ending in mid-2007, 10 La Jornada articles referred to some level of government using force toward religious groups (question GRI.Q.19), as shown in the table above. Content analysis of these 10 articles showed that the incidents affected 21 people or properties.8 For the year ending in mid-2010, five La Jornada articles referred to the use of government force toward religious groups, affecting a total of 112 people or properties. The information from the newspaper coincided with the coded scores from the sources used that year: that is, each had results within the range of 10-200 cases of government force.
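The comparison above turns on whether the two coding efforts place an incident count into the same coded range. A hypothetical sketch of that range-bucketing step (only the 10-200 band is taken from this appendix; the other band boundaries are invented for illustration and the actual codebook may differ):

```python
# Illustrative coding bands for counts of affected people or properties.
# Only the (10, 200) band is mentioned in the appendix; the rest are assumptions.
BANDS = [(0, 0), (1, 9), (10, 200), (201, 1000), (1001, float("inf"))]

def band_for(count: int) -> tuple:
    """Return the coded range that contains the given incident count."""
    for lo, hi in BANDS:
        if lo <= count <= hi:
            return (lo, hi)
    raise ValueError(f"no band for {count}")

# The La Jornada tallies for both study years (21 and 112 affected people or
# properties) fall in the same 10-200 band as the scores coded from the
# study's primary sources.
assert band_for(21) == band_for(112) == (10, 200)
```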
The expectation at the start of this analysis was that a Mexican newspaper would have more reports of religious restrictions and hostilities than the study’s primary sources because a local source would be more aware of local incidents than the broader cross-national sources used by this study. Instead, the analysis found that the coded news from La Jornada was largely consistent with coding using this study’s primary sources.
While a similar comparison for other countries might not yield the same results – especially in countries where press freedom is more limited – this analysis provides some confirmation of the reliability of the Pew Research Center’s coding across years. This comparison also provides some evidence that the sources used by Pew Research in its coding neither over-estimated nor under-estimated the level of religious restrictions and hostilities in Mexico in the study’s baseline year and its fourth year. (More details on the comparison are available upon request.)
1. See the methodology of the Pew Research Center’s 2009 report, “Global Restrictions on Religion,” for a discussion of the conceptual basis for measuring restrictions on religion.
2. The September 2012 report provided scores for 197 countries and territories. This report includes South Sudan (which separated from Sudan in July 2011), bringing the total to 198 countries and territories.
3. See Grim, Brian J. and Roger Finke. 2006. “International Religion Indexes: Government Regulation, Government Favoritism, and Social Regulation of Religion.” Interdisciplinary Journal of Research on Religion, vol. 2, article 1.
4. The one member state of the United Nations not included in the study is North Korea. The sources clearly indicate that North Korea’s government is among the most repressive in the world with respect to religion as well as other civil and political liberties. (The U.S. State Department’s 2012 Report on International Religious Freedom, for example, says that “Genuine freedom of religion does not exist” in North Korea.) But because North Korean society is effectively closed to outsiders and independent observers lack regular access to the country, the sources were unable to provide the kind of specific, timely information that the Pew Research Center categorized and counted (“coded,” in social science parlance) for this quantitative study. Therefore, the report does not include scores for North Korea.
5. See Grim, Brian J. and Richard Wike. 2010. “Cross-Validating Measures of Global Religious Intolerance: Comparing Coded State Department Reports with Survey Data and Expert Opinion.” Politics and Religion, vol. 3, issue 1: 102-129.
6. In 2011, the State Department’s IRF reports shifted from annual periods beginning and ending mid-year to calendar years. The Pew Research Center’s coding periods made the same change.
7. Additional research assistance was provided by María Concepción Servín Nieto.
8. If multiple articles reported on the same incident, only the most comprehensive article was included in the coding.