Winter Night

VARIABLE DEPENDENCE

Debating the merits of large- and small-N studies

Sample size does more than determine which methods suit a given study; theorists of social science have long pointed out that the number of cases under consideration shapes the kinds of questions researchers can ask and the structure of the causal claims they can make.

A 2003 paper by PETER HALL takes these debates further. In the context of comparative political science, Hall argues that the methods researchers use should be consistent with their beliefs about the nature of historical development. From the paper:

“Ontology is crucial to methodology because the appropriateness of a particular set of methods for a given problem turns on assumptions about the nature of the causal relations they are meant to discover. It makes little sense to apply methods designed to establish the presence of functional relationships, for instance, if we confront a world in which causal relationships are not functional. To be valid, the methodologies used in a field must be congruent with its prevailing ontologies. There has been a postwar trend in comparative politics toward statistical methods, based preeminently on the standard regression model. Over the same period, the ontologies of the field have moved in a different direction: toward theories, such as those based on path dependence or strategic interaction, whose conceptions of the causal structures underlying outcomes are at odds with the assumptions required for standard regression techniques.

The types of regression analyses commonly used to study comparative politics provide valid support for causal inferences only if the causal relations they are examining meet a rigorous set of assumptions. In general, this method assumes unit homogeneity, which is to say that, other things being equal, a change in the value of a causal variable x will produce a corresponding change in the value of the outcome variable y of the same magnitude across all the cases. It assumes no systematic correlation between the causal variables included in the analysis and other causal variables. And most regression analyses assume that there is no reciprocal causation, that is, that the causal variables are unaffected by the dependent variable. The problem is that the world may not have this causal structure.

Small-N comparison is therefore far more useful for assessing causal theories than conventional understandings of the ‘comparative method’ imply. Precisely because such research designs cover small numbers of cases, the researcher can investigate causal processes in each of them in detail, thereby assessing the relevant theories against especially diverse kinds of observations. Reconceptualized in these terms, the comparative method emerges not as a poor substitute for statistical analysis, but as a distinctive approach that offers a much richer set of observations, especially about causal processes, than statistical analyses normally allow.”

Link to the piece.
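Hall’s unit homogeneity assumption is easy to see in miniature. The toy simulation below is our illustration, not anything from the paper: it pools hypothetical cases from two groups in which x moves y in opposite directions, and a single-coefficient regression duly reports almost no effect at all.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500

    # Two groups of hypothetical "cases" in which x moves y in opposite
    # directions -- a world that violates unit homogeneity.
    x_a, x_b = rng.normal(size=n), rng.normal(size=n)
    y_a = 2.0 * x_a + rng.normal(size=n)   # effect of +2 in group A
    y_b = -2.0 * x_b + rng.normal(size=n)  # effect of -2 in group B

    # A pooled regression imposes one coefficient on all cases.
    x = np.concatenate([x_a, x_b])
    y = np.concatenate([y_a, y_b])
    print(f"pooled slope:  {np.polyfit(x, y, 1)[0]:+.2f}")      # ~0.00
    print(f"group A slope: {np.polyfit(x_a, y_a, 1)[0]:+.2f}")  # ~+2.00
    print(f"group B slope: {np.polyfit(x_b, y_b, 1)[0]:+.2f}")  # ~-2.00

The pooled estimate is not an arithmetic error; it is the right answer to a question this causal structure never poses, which is precisely the mismatch between method and ontology that Hall describes.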

  • “Except for probabilistic situations that approach 1 or 0 (in other words, those that are almost deterministic), studies based on a small number of cases have difficulty in evaluating probabilistic theories.” Stanley Lieberson’s 1991 overview of the causal assumptions inherent to small-N studies (a quick numerical illustration follows this list). Link.
  • Theda Skocpol and Margaret Somers on “The Uses of Comparative History in Macrosocial Inquiry.” Link.
  • Jean Lachapelle, Lucan A. Way, and Steven Levitsky use small-N process tracing to “examine the role of the coercive apparatus in responding to crises triggered by mass anti-regime protest in Iran and Egypt.” Link. Andrey V. Korotayev, Leonid M. Issaev, Sergey Yu. Malkov, and Alisa R. Shishkina present a quantitative analysis of destabilization factors in 19 countries during the Arab Spring. Link.
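A back-of-the-envelope version of Lieberson’s point, ours rather than his: if a cause produces its effect with probability p, a study of N cases observes no exceptions with probability p raised to the Nth power, so only near-deterministic values of p let a handful of cases speak clearly.

    # If a cause produces its effect with probability p, an N-case study
    # sees zero exceptions with probability p**N.
    for p in (1.0, 0.95, 0.7):
        for n_cases in (3, 5, 30):
            print(f"p={p:.2f}, N={n_cases:2d}: P(no exceptions) = {p**n_cases:.2f}")

With five cases, an exception-free record arises 77% of the time when p = 0.95 and 17% of the time even when p = 0.7, so the observed pattern alone cannot distinguish a strong probabilistic theory from a deterministic one; with thirty cases, the hypotheses come apart.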

New Researchers

Intangible capital and the labor share

University of Minnesota PhD candidate LICHEN ZHANG examines how micro-level heterogeneities generate varied macroeconomic outcomes. Her job market paper considers the relationship between increased investment in ‘intangible’ capital, such as software and R&D, and the decline in the labor share of income.

From the paper’s introduction:

“I find that when intangible-investment specific technical change (IISTC) is calibrated to match the observed decline in the relative price of intangible investment goods, my model can explain both the observed decline in the current BEA-measured labor share by around 60% and the decline in measured labor share of the pre-1999 revision—when intangibles are not treated as final output—by around 40%. Moreover, it accounts for more than 90% of the observed increase in the employment share of large firms (with more than 500 employees) and of the increase in the share of total sales going to the top 10% firms. In addition, the model predicts more than two-thirds of the observed reduction in annual firm entry rate.”

Link to the paper, link to Zhang’s website.

Each week we highlight great work from a graduate student, postdoc, or early-career professor. Have you read any excellent research recently that you’d like to see shared here? Send it our way: editorial@jainfamilyinstitute.org.

+ + +

  • Two new posts on the Phenomenal World this week. In the first—which we published in collaboration with the Los Angeles Review of Books—Jack Gross interviews the legendary historian of science Lorraine Daston. “No universal ever fits the particulars. Never in the history of human rulemaking have we created a rule or a law that did not stub its toe against unanticipated particulars.” Link.
  • As a companion to the above, Rodrigo Ochigame surveys the long history of attempts at formalizing fairness: “Algorithmic fairness should be understood not as a novel invention, but rather as an aspiration that reappears persistently in history.” Link.
  • Led by Arden Ali, JFI’s Digital Ethics team is co-organizing a conference in April with MIT’s Social and Ethical Responsibilities of Computing Working Group: “The Ethics of Algorithms.” The call for abstracts is open to all academic disciplines; read it here.
  • From the always-excellent Carbon Brief, an interactive showing how the UK has transformed its electricity supply since the 2008 Climate Change Act. Link.
  • “We confront two seemingly-contradictory observations about the US labor market: the rate at which workers change employers has declined since the 1980s, yet there is a commonly expressed view that long-term employment relationships are more difficult to attain. We reconcile these observations by examining how the distribution of employment tenure has changed in aggregate and for various demographic groups.” By Raven Molloy, Christopher Smith, and Abigail Wozniak. Link.
  • On college access and adult health. By Benjamin Cowan and Nathan Tefft. Link. h/t Laura
  • “Until the mid-1930s, the 12 Federal Reserve banks had the ability to set their own discount rates and conduct independent monetary policy.” Pooyan Amir-Ahmadi, Gustavo S. Cortes, and Marc D. Weidenmier on regional monetary policy and the Great Depression. Link.
  • “Strong Firms, Weak Banks: The Financial Consequences of Germany’s Export-Led Growth Model.” By Benjamin Braun and Richard Deeg. Link.
  • Related to last week’s spotlight on industrial policy, a 2004 paper from Ricardo Hausmann, Lant Pritchett, and Dani Rodrik on “Growth Accelerations.” Link.
  • Jan Fichtner “compiles the first ‘anatomy’ of the Cayman offshore financial center (OFC), utilizing all sources of publicly available data about the three main segments: banking, direct investment, and portfolio investment.” Link.
  • A paper by Giovanni Dosi et al. presents an agent-based model to explain sluggish wage and productivity growth, and increased dispersion of firm productivity and worker earnings. “It shows that a single market process unleashed by the decline of unionization can account for both the macro and micro economic phenomena.” Link.
  • “Can new technology cause social instability and unrest? We look at one famous historical episode—the ‘Captain Swing’ riots in 1830s England. These constitute the largest wave of political unrest in English history, with more than 3,000 cases of arson, looting, attacks on authorities, and machine-breaking across 45 counties. Newly-collected data on threshing machine adoption shows that new technology was associated with both higher unemployment and more riots. Based on data about soil fertility and suitability for water power, we also show that the link was causal: Areas with more water power and better conditions for wheat cultivation witnessed both greater adoption of threshing machines and markedly more riots. In parishes where the technology was not adopted, the riot probability was 13.6%; in places where threshing machines had spread, it was 26.1%—twice as high.” By Bruno Caprettini and Hans-Joachim Voth. Link.


Subscribe to Phenomenal World Sources, a weekly digest of recommended readings across the social sciences. See the full Sources archive.