The One-Dimensionality of Econometric Data: The Frankfurt School and the Critique of Quantification
Scott Timcke
Independent Scholar, stimcke@gmail.com
Abstract: Econometric data are used to produce authoritative facts about the world. Yet, as numbers enjoy a central place in modern reasoning (particularly in government as their presumed objectivity and neutrality assist impartial decision-making), it is important that they receive scrutiny. Using methodological techniques from Western Marxism, with special reference to the work of Lukács, Horkheimer and Adorno, and Marcuse to inform a critique of Acemoglu and Robinson, I argue that the historical emergence of econometrics as a mode of mediated knowledge is a reified practice within the broader technical administration of social life, a practice that is not a transparent representation of social phenomena. This is because when econometrics transforms the thing being measured into a statistical indicator it eclipses political disputes with technical disputes, sidestepping good faith democratic deliberation about what goods are worth pursuing. Effectively, one-dimensional thought cannot perceive the origins of items put into circulation and so ideology is produced – what seems value-free is value-laden.
Keywords: econometrics, data, reification, ideology, numbers
Acknowledgement: A previous version of this article was
presented at the 2018 International Critical Theory Conference of Rome, Loyola
University Chicago. Special thanks go to Rick Gruneau, Beverley Best, Graham
Mackenzie, and the two anonymous reviewers.
Daron Acemoglu and James Robinson are among the
leading figures in contemporary American political economics.[1]
Their book Why Nations Fail was shortlisted for the 2012 Financial Times
and Goldman Sachs Business Book of the Year and included in the Washington Post’s
‘ten best books’ for the same year. Their previous book, Economic Origins of
Dictatorship and Democracy,
was similarly well received, being awarded the 2007 American Political Science
Association’s Woodrow Wilson Award. Allan Drazen called their book “truly
path-breaking” (2007, 163) and William Easterly
described it as “one of the most important contributions to the literature on
the economies of democracy in a very long time” (2007,
173). With this acclaim, it is fair to say that Acemoglu and Robinson represent
a predominant and prizewinning branch of political economic analysis conducted
in the United States, a kind of political economy especially concerned with
macro-economic growth.
One of their core beliefs is that the United States has a high degree of
democratisation because of its inclusive economic institutions (Acemoglu and Robinson 2012a, 74). In my view this
assessment is hard to sustain when considering that the American 1% owns 36% of
all private wealth, 40% of all financial wealth, 50% of stocks, and over 60% of
business equity (Wolff 2014; also see Wolff 2017; Saez and Zucman
2014). Indeed, together the three wealthiest American billionaires – Jeff
Bezos, Bill Gates and Warren Buffett – have more wealth than the bottom 50% of
the American population, nearly 165 million people (Collins
and Hoxie 2017, 2). It is not that Acemoglu and Robinson have simply made a
forgivable error about the nature of economic inclusivity, but rather that
their methodology is liable to generate these kinds of claims in the first
place. This is because they do not fully recognise that “economics is how
modern politics is conducted” (Timcke 2017, 2). This
vignette conveys some of the characteristics of contemporary American political economy, a field in which econometric data are employed to produce authoritative facts about the world but which, through methodological nationalism, papers over the extraction and transfer of surpluses from exploited regions.
As numbers enjoy a central place in modern reasoning, particularly in government, where their presumed objectivity and neutrality are taken to assist impartial decision-making, it is important that this ‘politics by quantities’ receives scrutiny. Using
methodological techniques from Western Marxism – with special attention to Lukács,
Adorno and Horkheimer, and Marcuse – I argue that the emergence of econometrics
as a mode of mediated knowledge is a reified practice within the broader
technical administration of social life, a practice that is not a transparent
representation of social phenomena. This is because when econometrics
transforms the thing being measured into a statistical indicator it eclipses
political disputes with technical disputes, sidestepping good faith democratic
deliberation about what goods are worth pursuing. Moreover, there are parallels
between the use of econometric models and Marx’s analysis of the commodity
form: one-dimensional thought cannot perceive the origins of items put into
circulation. What seems value-free is value-laden. And so, Marx’s insight that
bourgeois thought concerns itself with objects that arise either from the
process of studying phenomena in isolation, or from the division of labour and
specialisation in the different disciplines, remains valid. In effect, a ‘politics
by quantities’ dissipates the social question.
The goal in this article is to demonstrate how econometrics as a mode of
knowledge production understands, organises and controls social life the world
over. There are several steps involved in this argument. In Section 2 I review
how Acemoglu and Robinson, as emblematic of orthodox American political
economy, conceptualise their symbolic reasoning, and how this quantification
comes to mediate social phenomena, thereby determining them as objects. I build
upon these observations in Section 3 through undertaking a selective historical
analysis on the role of statistical inquiry during European state formation as
it relates to accomplishing economic growth. The remaining section employs
Western Marxism’s critique of quantification to highlight what is at stake in
the symbolic reordering of social life as well as what kinds of mystifications
are courted by econometrics. I conclude by offering thoughts on the importance
of leveraging this latter critique to analyse the econometric practices in
contemporary data brokerage.
Why
Nations Fail uses narrative case studies to distinguish
between ‘inclusive institutions’ and ‘extractive institutions’; it is
nevertheless written in the tradition of institutional analysis and guided by
rational choice theory towards questions about the relative wealth of nations.
Even so, it can best be thought of as the simplified companion piece to the econometrically
dense Economic Origins of Dictatorship and Democracy. The institutional
analysis that Acemoglu and Robinson conduct concludes that concentrating power
within an elite almost always inhibits a country’s economic success, because
the elite enrich themselves at the expense of economic growth. By contrast,
inclusive institutions tend to be more successful in the long run because they
make pro-growth choices which in turn increase prosperity. This is how Acemoglu
and Robinson define these key concepts:
Inclusive
economic institutions, such as those in South Korea or in the United States,
are those that allow and encourage participation by the great mass of people in
economic activities that make the best use of their talents and skills, and
that enable individuals to make the choices they wish (2012a, 74).
They
add that inclusive economic institutions have a robust private property regime backed by the rule of law and a state bureaucracy willing and able to enforce
contracts. This system permits capital and labour mobility (Acemoglu and Robinson 2012a, 74). By contrast,
Extractive
political institutions concentrate power in the hands of a narrow elite and
place few constraints on the exercise of this power. Economic institutions are
then often structured by this elite to extract resources from the rest of
society. Extractive economic institutions thus naturally accompany extractive
political institutions (Acemoglu
and Robinson 2012a, 81).
Keeping
these concepts and definitions in mind, in the Economic Origins of Dictatorship
and Democracy, Acemoglu and Robinson propose that democratisation and authoritarianism
depend on three key values:
· The cost of revolution, represented by the symbol µ;
· The cost of repression, represented by the symbol κ;
· The inequality of society, represented by the symbol θ.
Additional relationships can then be expressed in terms of these values, together with the levels µ* and κ* that appear in their propositions.
According
to their Proposition 6.3, if θ ≤ µ, then the status
quo prevails and “elites can stay in power without repressing, redistributing,
or democratizing” (Acemoglu
and Robinson 2006, 199).
In plainer terms, if the social costs of inequality are no greater than the social costs of a revolution, then elites can retain power without the need for, or sufficient pressure towards, egalitarian reforms. Put differently, elites avoid facing the prospect of higher taxes or other unwanted policies, now or in the future. However, if the social costs of inequality are higher than the social costs
of a revolution, then a new set of pathways emerges. Acemoglu and Robinson
delineate and express these options as:
1. If µ ≥ µ* and κ ≥ κ*, repression is relatively costly and so elites redistribute income to avoid revolution.
2. Or, if µ < µ* and κ’ < κ*, or if κ’ ≥ κ* and the poor strictly prefer revolution to democracy, or if µ ≥ µ* and κ < κ*, then elites use repression to maintain the status quo.
3. Or, if µ < µ*, the poor prefer a weak democracy to revolution, and κ ≥ κ*, then concessions are insufficient to avoid a revolution while repression is relatively costly, and so elites opt to democratize. (Acemoglu and Robinson 2006, 199 [paraphrased for slight simplification]; the cases are restated in consolidated form below.)
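Since the surrounding prose conditions all three options on the social costs of inequality exceeding those of revolution (θ > µ), the case analysis can be gathered into one display. What follows is my own consolidation of Acemoglu and Robinson’s (2006, 199) Proposition 6.3 rather than their notation, and it reads µ* and κ* as threshold values for the costs of revolution and repression, which is an interpretive gloss on the excerpt above.

```latex
% A consolidated paraphrase of Acemoglu and Robinson (2006, 199);
% the notation here is mine, not theirs. \theta = inequality,
% \mu = cost of revolution, \kappa = cost of repression;
% \mu^{*}, \kappa^{*} are read as threshold values, \kappa' as above.
\[
\text{outcome} =
\begin{cases}
\text{status quo} & \theta \le \mu \\
\text{redistribution} & \theta > \mu,\ \mu \ge \mu^{*},\ \kappa \ge \kappa^{*} \\
\text{repression} & \theta > \mu \text{ and } \bigl(\mu < \mu^{*} \text{ with } \kappa' < \kappa^{*}, \\
 & \quad \text{or } \kappa' \ge \kappa^{*} \text{ and revolution is strictly preferred,} \\
 & \quad \text{or } \mu \ge \mu^{*} \text{ with } \kappa < \kappa^{*}\bigr) \\
\text{democratisation} & \theta > \mu,\ \mu < \mu^{*},\ \kappa \ge \kappa^{*}, \text{ weak democracy preferred}
\end{cases}
\]
```

Whatever one makes of the notation, the display makes the method plain: each political outcome follows mechanically from the ordering of a handful of scalar quantities, which is precisely the manoeuvre this article interrogates.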
At
this point I want to pause and restate the above basic relationship in plainer
terms in order to make the reasoning more apparent. To begin, Acemoglu and
Robinson argue that when the social costs of inequality are higher than the
social costs of a revolution, elites are faced with three basic strategies.
First, given high levels of inequality, if the ongoing costs of the repression needed to enforce this inequality are higher than the costs of redistribution, then elites can stave off revolts by making concessions, whether by redistributing income or by offering terms more favourable to the poor majority. If no concessions would be sufficient to stave off revolution, a second strategy is for elites to continue to repress the poor. The third strategy is for
elites to minimise inequalities to an intermediate level to reduce the prospect
of a revolution and then offer credible commitment to reallocating power in the
future (see Acemoglu
and Robinson 2006, 26). More
recently, Acemoglu and Robinson have called this third strategy “the narrow
corridor” (2019).
These statements,
Acemoglu and Robinson believe, “[feature] all the essential elements of our
approach to democratization” (2006,
187; 181).
Based upon these econometric statements and a wide array of inputs from
multiple datasets, Acemoglu and Robinson’s policy prescriptions are construed
as merely the logical extension of technical deductions. And so, when
substituting the definitions and concepts into these econometric expressions,
one arrives at their conclusion that “democracy emerges as an equilibrium
outcome only in societies with intermediate levels of inequality” (2006, 199). The inclusion that comes from democratisation,
in the long run, returns higher rates of growth. Therefore, it is in the elite’s
best interest, if they prioritise wealth accumulation, to pursue this option. For the poor, on the other hand, revolutions pose difficult collective action and coordination problems, and risk destroying productive infrastructure and existing wealth. Accordingly, it is in their best
interest to accept the prospects of reduced inequality and the reorganisation
of power at a later date, while also benefiting from economic growth arising
from inclusion.
Acemoglu and Robinson’s
work has two important conclusions. First, when an elite or narrow ruling class
has near-unanimous control, they establish extractive institutions that benefit
themselves at the expense of other members of society. However, if control is
diffused, or there are checks and balances, higher growth will follow. Second, as
and when shocks occur, the kind of institution matters a great deal. As Acemoglu
and Robinson write, “different political institutions lead to different
outcomes” (2006,
89).
Putting stock into the spectrum of extraction and inclusion, it follows that
collective bargaining power matters and is valuable on its own terms, as well
as increasing national economic performance.
Yet, despite these
insights, something is amiss. I think we can begin to see the problem when
undertaking a methodological comparison. Consistent between both Economic
Origins of Dictatorship and Democracy and Why Nations Fail is the
principled inversion of the Marxist account of institutions. While Acemoglu and
Robinson follow some materialist protocols, like the identification of class
struggle over distribution (2006,
20-21),
they have two principal objections to Marxian analysis. The first is that they disagree
with Marx’s materialist explanation about the mode of production producing the
superstructure – “it wasn’t technology driving the political organization of
society, but the political organization and institutions of society determining
what technology could be used” (2012b). Second, they regard Communism as “the new
absolutism of the twentieth century”, calling these regimes “brutal,
repressive, and bloody,” predicated upon “extractive institutions” (2012a, 431). They firmly believe that Marxist economic theory licenses the looting of the state, the enrichment of a new elite, and so on; that it is extraction under the guise of inclusion.
These criticisms reveal
the limits of their methods on their own terms, for they fail to appreciate
that it was not the ideological content of these communist institutions that
was the problem, but that they were authoritarian. These two characteristics
are not identical. Moreover, the USSR itself was an empire; imperial projects
are predicated upon extractive logics. A better approach to the study of states
and markets, including communist ones, would be to look at the historically
unfolding networks of combined and uneven development that do not privilege the
nation-state as the boundary of analysis, a task undertaken superbly by Walter
Rodney (1981) in How Europe Underdeveloped Africa,
or Perry Anderson (1974) in Lineages of the Absolutist State, for instance. In this way, one can
see that polities are not isolated entities, unmoored abstractions, but rather
are historically formed through and by material forces that permeate and pass
through their formal boundaries. In short, the objects that Acemoglu and
Robinson study are decontextualised to bracket out any contingency, while also
seeking to standardise the subjects of development. I argue that this is
because they are wedded to the notion of society-as-an-object, a dynamic that
emerges because of their strict adherence to formal quantitative reasoning.
Given that social
conditions shape what constitutes trustworthy or sufficient data collection, as
well as what constitutes a sound analysis of that data, a critique of econometrics
raises epistemological issues about economic practices, in particular on how
technological sophistication backed by institutionally based expertise like
that enjoyed by Acemoglu and Robinson produces intelligible explications. In
the remaining half of this article I outline some ramifications of this kind of
mediatisation. I shall first review selected contemporary historians,
anthropologists, and sociologists who critique econometric reasoning, as well
as the general consensus they reach. But while I think these scholars offer
considerable insights, I think their critiques are not radical enough.
Accordingly, I then turn to Western Marxism’s critique of quantification and leverage it to show how Acemoglu and Robinson’s work is a deep depoliticisation of
social questions.
Like
most modern sciences, statistics developed concurrently with European state
formation, meaning that the history of this disciplinary practice is inflected
by the era, notions of progress, conceptions of suitable kinds of things to
measure and so on. Beginning in the 17th century, central governments came to rely upon demographic calculations to govern increasingly complex societies. As William Davies writes, “Casting an eye over national populations, states became focused upon a range of quantities […including] births, deaths, baptisms, marriages, harvests, imports, exports, price fluctuations.” Davies
sums up the reconfiguration thus: as parish registries became nationally
aggregated, “Statistics would do for populations what cartography did for
territory” (2017, 1). Like cartography, statistical governance
was tested in African colonies (Tilley
2011; Breckenridge 2014). This broader colonial gaze, James Scott
notes, was put in place by a diligent “civil society” to facilitate the
“administrative ordering of nature and society” and institute “the capacity for
large-scale engineering”, both deemed desirable elements of a “high-modernist view”
(1998, 5). By the early 20th century, the familiar
categories of analysis had been established, and had been put to service by European
states as well as by the bourgeoisie in the market.
To poach from John
Thompson’s analysis of the development of media, this rise of statistical
reasoning was “a reworking of the symbolic character of social life”, which
results in “a reorganization of the ways in which information and symbolic
content are produced and exchanged in the social world and a restructuring of
the ways in which individuals relate to one another and to themselves” (Thompson 1995, 11). In summary, by the mid-20th century the
entire basic repertoire of economic statistics was under consolidation, the
by-product of which was to produce a new kind of object for government; new
ways of manipulation and effects to be registered, all themselves products of
modernity.
Channelling the precept
that “statistical facts are produced by particular actors, in particular
contexts, with particular interests” (2001,
3), Adam
Tooze provides an excellent analysis of the post-war transformations of
statistical reasoning in economics. First, he identifies a great “global
standardization of the modern repertoire of macroeconomic statistics” that
included key variables like “national income, physical production, employment,
balance of payments, and volume of money in circulation, and the aggregate
price level” (2001, 4; 9-10). Consolidated in a “new empirical image of
the economy” this interest in statistical techniques related to “the production
of factual economic knowledge” (Tooze 2001,
4; 3). Second, this standardisation rapidly
diffused: right after the Second World War, nearly 40 states provided
assessments of national income, while a decade later 80 did. “The qualitative
change in data was dramatic”, he writes (2001,
8),
as it effectively rendered social questions (questions of unearned rents and
divides between labour and capital) irrelevant. Instead the economic
interpretation of national income emphasised productivity and the business cycle. The bifurcation of the economy from social relations can be set in contradistinction to Marx’s interest in the contested shares of labour and capital. The point is that
numerical representations aided the conceptualisation of the economy as growth
of national income, a feature that still haunts orthodox economic reasoning,
theory, and training.
Many of these elements
are reflected in John Maynard Keynes’ The General Theory of Employment,
Interest and Money. This text can be considered as emblematic of modern
macroeconomics, one that greatly enhanced a strand of macroeconomic thinking
that developed from the 1870s onwards. As Geoff Mann argues, the influence of
Keynes can be attributed less to his originality of research on, say, effective
demand or liquidity preference, and more to a receptive audience, ideologically
primed both for this message about an administratively engineered recovery of
capitalist accumulation and for the scientific expertise in which it was
delivered (see Mann 2017). Herein we see all the hallmarks of the
high modernism Scott identified. From this point one begins to see, as Tooze
writes, “the development of mathematical techniques for analyzing statistical
data and testing theory” (2001,
12).
By the 1990s, the
expansion of econometrics and quantitative modelling was one of the most
significant trends in economics and related disciplines, adopted in turn by
think-tanks and governments (see Lawrence
2010). Moving
from relatively basic assessments such as tallying votes or creating districts
for representation, to more complex assessments like the monitoring and
evaluation of public policy, to assessing equitable public spending in state
budgets, econometrics is entangled with calculability and control, bureaucratic
operations which draw upon evidence for evidence-based public policy, but which
really serve the reproduction of hegemonic structures of power and inequality.
Aside from these
political issues, epistemologically more pernicious errors occur when inducing
correlations using indicators as proxies for other variables, like GDP for
development, or Gini coefficients to stand in for elites’ instincts for
self-preservation or reform. As an example of how method creates explication,
consider GDP as an index of economic development. Nominally it is intended to
track the economic growth in a state. Nevertheless, Thomas Piketty notes that
this indicator “is a reflection of an era when the accumulation of industrial
goods was thought to be an end in itself, and an increase in production seen as a solution to everything”. The problem with this indicator is that it does not
take account of the “depreciation of capital that made production possible”,
nor the “flow of profits between countries” (2017, 53; 54). These two oversights mean that per capita
incomes based on the GDP can be inflated, such that there is a systematic
underestimation of economic hardships. This is but one illustration of the
shortcomings of quantified indices. But the more fundamental objection is that
using an indicator like GDP reveals prior assumptions and post-hoc rationalisation
which simplify a complex array of value judgements, social processes, and
political contests. What remains is the common sense of the researcher: or, to
put it otherwise, their ideology and the reductions it courts.
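To make Piketty’s two oversights concrete, consider a deliberately stylised sketch. The figures below are invented for illustration, not drawn from any dataset; the point is only the direction of the error once depreciation and profit outflows are subtracted.

```python
# A stylised sketch of Piketty's point (all figures hypothetical).
# GDP counts gross production, ignoring both the depreciation of the
# capital that made production possible and the net flow of profits
# to owners abroad, so per capita GDP can overstate residents' income.

gdp = 100.0e9            # gross domestic product (hypothetical)
depreciation = 15.0e9    # capital consumed in producing that output
profit_outflow = 10.0e9  # net profits remitted to foreign owners
population = 50.0e6

gdp_per_capita = gdp / population
net_income_per_capita = (gdp - depreciation - profit_outflow) / population

print(f"GDP per capita:        ${gdp_per_capita:,.0f}")         # $2,000
print(f"Net income per capita: ${net_income_per_capita:,.0f}")  # $1,500
# On these invented numbers, the headline indicator overstates the
# income actually available to residents by a third.
```

On such numbers, rankings and growth narratives built on the headline figure systematically understate economic hardship, which is exactly the kind of inflation the passage above describes.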
At the level of research
practice, Morten Jerven writes: “If you ask an economist about the evidence
supporting their conclusions, they will direct you to the inferential
statistical results and tell you about coefficients of determination, statistical
significance and robustness tests”. Conversely, “if you ask a historian about
evidence, he or she will respond by telling you about the quality of the
primary observations” (2015,
16).
Jerven argues that econometricians commonly lack historical awareness; that
they could do with a dose of economic history. But his more important point is
that, due to the compromised nature of the data collection process, datasets bear no
resemblance to actually existing social life (Jerven 2013), and so the subsequent econometric
analysis, no matter how technically well executed, is not the mirror of
economic activity. What appears precise is anything but. It is for these
reasons that Jerven argues for a “political ethnography of indicators” that
traces “the line of causality from ‘data’ to ‘decisions’” (2016), and which can subject the numbers to
closer critical scrutiny to understand the conditions of their production and
dissemination.
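Jerven’s point can be illustrated with a minimal simulation, offered here as my own sketch rather than his procedure. Suppose the ‘dataset’ records the underlying activity through a noisy, biased collection process; a competent regression on those records still produces conventionally respectable statistics.

```python
# A minimal simulation of Jerven's worry (illustrative only): the
# recorded data are a compromised proxy for the underlying activity,
# yet the regression output still looks precise.
import numpy as np

rng = np.random.default_rng(0)
n = 500

true_activity = rng.normal(size=n)                  # what actually happened
outcome = 2.0 * true_activity + rng.normal(size=n)  # true effect is 2.0

# The 'dataset': the activity as recorded by a noisy, systematically
# biased collection process.
recorded = true_activity + rng.normal(scale=1.0, size=n) + 1.0

slope, intercept = np.polyfit(recorded, outcome, 1)
predicted = slope * recorded + intercept
r2 = 1 - np.sum((outcome - predicted) ** 2) / np.sum((outcome - outcome.mean()) ** 2)

print(f"fitted slope: {slope:.2f} (true effect: 2.00)")
print(f"R-squared:    {r2:.2f}")
# At this sample size the coefficient passes any conventional
# significance test and the fit is respectable, yet the estimate is
# roughly half the true effect because the recording process, not the
# analysis, is compromised.
```

Nothing in the regression diagnostics flags the problem; only attention to the quality of the primary observations, the historian’s question, would.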
The anthropologist
Sally Engle Merry offers perhaps one of the best recent examples of this political
ethnography of indicators. For her, quantification of social life is a “mode of
governance” stemming from “the desire for accountability” (2016, 3). Quantification is a way to gather and
represent empirical knowledge, showing objectively how the world ‘really is’,
thus legitimating the use of such measures for political decision-making. All these elements
contribute to what she terms the “seductions of quantification”, that is, the
belief that “technocratic knowledge seems more reliable than political
perspectives in generating solutions to problems, since it appears pragmatic
and instrumental rather than ideological” (2016,
4).
But this is not the case. As Bruno Latour and Steve Woolgar (1986) demonstrate in Laboratory Life, numbers are created through a series of decisions with the aid of mathematical models, their simplicity deflecting attention from their constructed character. Likewise,
Alain Desrosières notes, quantified objects become “repeated in other
assemblages and circulated as such, cut off from their origins – which is after
all the fate of numerous products” (1998,
3):
their presumed objectivity and universality implies that they have a degree of
transferability across a range of contexts. In effect, numbers construct and
mediate the objects they represent. And, as with all mediations, there is the
possibility of deception and misperception. What I mean is that numbers create
and make visible the objects they measure. It is in this transformation that
numbers take on a life of their own; however, their apparently impartial use
administrative processes has far-reaching consequences.
Orthodox political
economists are aware of and have somewhat responded to these critiques. For
example, Paul Romer has recently taken the discipline to task in his paper,
‘Mathiness in the Theory of Economic Growth’. “Mathiness lets academic politics
masquerade as science”, he writes. As pretence, “mathiness” allows “slippage
between statements in natural versus formal language and between statements
with theoretical as opposed to empirical content” (2015a, 89). There is merit to this point. Indeed,
Acemoglu and Robinson provide a case in point when they seize upon Marx’s
polemic adage “The handmill gives you society with the feudal lord; the
steam-mill, society with the industrial capitalist” to claim that Marxian
material analysis is a theoretical cul-de-sac. (If academic politics was the
standard, then I could reiterate Marx’s rhetorical barb that “Economists have a
singular method of procedure” and claim that as sufficient proof for definitive
argumentative victory.) Still, Romer’s solution is to swap academic politics
for ideal science, as it can bring “unique clarity and precision in both
reasoning and communication”. Indeed, he adds that “It would be a serious
setback for our discipline if economists lose their commitment to careful
mathematical reasoning” (2015b).
While I have a
qualified endorsement of this view, I do not think Romer’s proposal is grounded
in an adequate conceptualisation of the effects of quantification practices,
even in their ideal form. As Mary Morgan notes, “adopting a new reasoning style
into a science does not come without significant consequences for its content”
(2012, 17). Indeed, the methodological decisions that
econometricians make in devising models and testing data to develop economic theories themselves create explications (Morgan 1996, 263-264). For her, this act of creation is not
simply one of pure logic but permits ideological encoding to be integrated into
the means of inquiry. This is not to diminish the difficulty of econometric
model-making, nor to besmirch the skill and craft involved. Rather it is to
underscore the social components that also reside in the mode of analysis.
As an example of how
the social is encoded in a mode of inquiry, consider Geoffrey Bowker and Susan
Leigh Star’s observation that while “ordinarily invisible”, disputes about orthodox
statistical classification measures can become “fraught with political passion”
because symbolic and material dividends are consequences of categorisation (2000, 3; 4). These disputes demonstrate the extent to
which statistics have power in public discourse to skew life chances; why else
would they be an object of and instrument in struggles? For example, from her
study of high financial practices in the early 2000s, the kind of activities
that led to the 2008 sub-prime mortgage crisis, Saskia Sassen writes that
“assemblages of complex types of knowledge and technologies –including
algorithmic mathematics, law and accounting, and high-level logistics – have
generated complex predatory formations” (2017, 1). Sassen
suggests that complexity hides this predation (becoming ordinary, to use Bowker
and Star’s terminology), and instead creates barriers for who can pose as
economic authorities.
Given the rise of data
brokerage as a sizable economic sector, the democratic critique of opacity, access,
and diversity in the analysis of data and its role in public life has merit (Pasquale 2015). But it is also incomplete. When complex
social issues are represented and addressed via quantities, the political
becomes technical, thus substituting for and discarding the kinds of democratic
discussions Jerven, Merry, Latour, Woolgar, Desrosières, and Sassen draw our
attention to. In other words, the quantification of social phenomena changes
the conceptualisation of the distinction between the realm of the political and the
realm of the technical. This makes the quantification of social affairs even
more pernicious as it sublimates inherently political practices to render them
subject to formal logic.
To develop this theme further,
as well as to connect it to more foundational relations in capitalist realism, in
the next section I turn to Lukács’ ontology, which is central to his critique
of reification, a concept that figures prominently in the Frankfurt School
analysis of late modernity. Reification, I suggest, is at the foundation of the
ideological ontology econometrics serves. Thereafter I turn my attention to
Marcuse’s critique of one-dimensional society to draw links between the
underlying ‘laws of motion’ of 20th-century bureaucracy and 21st-century
econometric analysis.
When Morten Jerven writes that “Freedom
House actually does not measure ‘democracy’; that the Consumer Price Index does
not actually measure ‘inflation’; nor does Transparency International actually
measure ‘corruption.’ We just pretend ‘as if’ they do” (2016),
he is appealing to the concept of reification. Within Western Marxism, Georg
Lukács is well known for articulating and deploying this concept to sustain a
critique of the rational organisation of social life, this itself being
enfolded within capitalism’s maturation. He rather famously uses the clock as
an explanatory metaphor to discuss the rational control of labour. He says, “time
sheds its qualitative, variable, flowing nature; it freezes into an exactly
delimited, quantifiable continuum filled with quantifiable ‘things’” (Lukács 1971, 90). To simplify, he suggests that in
capitalist industrialisation social relations become objectified and
abstracted away, this process facilitated by conceptual systems wherein the
ruling class and their agents see labour-time as just another calculable
quantity in their ledgers. Here, “quantification is a reified and reifying
cloak over the true essence of the objects and can only be regarded as an
objective form” (1971, 166). Through adopting this
stance, the ruling class take on the “attitude of the experimenter” (1971, 131), believing that their positions give them
control, and that this control is “uninterested” in the social quotients of
production (1971, 166). In this permutation,
reification names the epistemic error of treating the products of structural forces as isolated events rather than as parts of a wider social system.
To my mind, Lukács’ description of the quantification of human labour-time
being integral to capitalist production is an insight that can be extended to
quantification more broadly and econometrics in particular. Econometrics is an
exemplary methodological practice of the kind of abstract conceptual system which
objectifies and neglects social and political processes through the application
of observation duly deemed neutral and practical. To elaborate, as a ‘reified
and reifying cloak’, quantification constructs an object ready for technical
manipulation and bureaucratic recognition. And, much like reification,
quantification has ideological effects that mediate and constitute
relationships between subjects and objects in ways that call back to the
process of commodity fetishism. What I mean is that the history of the labour
process is eclipsed in the same manner that the commodity becomes the dominant
social form. In effect, Lukács is adamant that reification emerges out of the
kind of complexity where the distinction between the material and the
conceptual is obliterated.
Picking up on Lukács’ analysis, Horkheimer and Adorno repurpose it to
form a critique of rationality. This critique is not concerned with the
analytical method per se, but rather with a society that “equates
thought with mathematics” in the “assumption that the trial is prejudged”,
condemned to its own measure (2002, 18; 20). Their principal aim in the Dialectic
of Enlightenment is to argue that unchecked rationality is
unreasonable. I understand them to mean that rationality becomes an “automatic
process” (2002, 19) when subordinated to positivism’s
tendency towards reification, in which a bifurcation places rationality in
opposition to irrationality, this dichotomy grafted onto a conception of
distinct modern and pre-modern modes of understanding. However, this
presumption is largely incorrect, as the rationalism of modern societies has
ritualistic mythical components, one of which is the deference to calculations
and quantification. They put it bluntly: “Mathematical procedure became a kind
of ritual of thought”. These short excerpts illustrate their awareness of how the
separation of the subject from the objects of technical practice results in the
‘equation of thought with mathematics’, a ritualistic process by which
subjective human testimony is subordinated to objective concrete numbers.
Quantification not only lends the numbers an objective appearance, but also a procedural neutrality. To call back to econometrics, the remedy is to
“grasp existing things as such, not merely to note their abstract
spatial-temporal relationships” (2002, 20).
So where political economists like Acemoglu and Robinson see precision,
critical scholars like Horkheimer and Adorno see alienation.
When presented in this fashion, it is easy to see how Lukács’
notion of reification has informed Horkheimer and Adorno’s analysis of rational
modelling, particularly in their shared concern regarding the severance of the
subject-object dialectic, as seen in the commodity form and in the
mathematization of society. This suggests that they all recognise that capitalist
social relations have generated an epistemology. Put succinctly, “Knowledge in
class-based societies is class knowledge”, as Christian Fuchs notes; this
characterisation “does not mean that the knowledge of the dominant class is
always false and the one of the dominated class always true (the opposite can
be the case), but rather that knowledge in class-based society is shaped by
struggles about how to and who can define reality” (Fuchs
2016, 89). To call back to econometric reasoning, while it has
institutional status and credence – and these certainly matter – the more
fundamental issue is that it is one prevailing means by which mathematization
and formal modelling comes to objectify the social world, thereby substituting
in class knowledge, assumptions and axioms for a more grounded and dialectical comprehension
of the social world. It is, in other words, a kind of one-dimensional
methodology.
As I begin to conclude this article, I want
first to briefly address Marcuse’s critique of one-dimensional society to show
that econometrics helps to regulate capitalist ‘laws of motion’, rather than providing opportunities for
reflection and critique. Thereafter, I will return to and elaborate upon the
topic of econometrics as a neo-positivist method rooted in anti-dialectical
thought.
Marcuse’s critique begins with social changes in post-war American life
wherein procedural-pluralist liberalism and technocratic administration had
gained ascendancy. Whereas scholars like John Rawls and Robert Dahl saw the foreclosure of struggles over first-order values, Marcuse noted a contradiction where,
despite greater wealth, goods, and services, the workday had also increased and
intensified, meaning that workers could not benefit fully from this wealth, or
these goods and services. As opposed to procedural-pluralism, post-war American
capitalism, Marcuse proposed, had rather redeveloped mechanisms of rule to
contain and defuse revolutionary dissent. It took the form of converting any
specific deviance into general compliance; dissent became another means to reproduce
the capitalist order. What remained were mild transgressions and defiance, actions alienated from any unconscious revolutionary spirit. The development of ‘repressive desublimation’ effectively removed the grounds for challenging the wider dominant social structure: actions associated with these mild transgressions are neither revolutionary nor emancipatory.
Sparing all but essentials, in Eros and Civilization, Marcuse
sought to explain repressive desublimation by weaving together Marx’s conception of surplus labour – which demonstrates that capitalism rests on the exploitation of the working class – and Freud’s argument that modernity contains inherently repressive elements which sublimate unconscious erotic desires and demands for instant gratification.[2] This produced the concept of surplus repression. Like surplus labour, surplus repression is over and above what is required for social reproduction; that is, its
function is to maintain unyielding capital accumulation by inducing labour
deference under demands of high productivity – here workers psychologically
internalise and act in accordance with capital’s interests, thereby naturalising
repression at the expense of acknowledging the unequal property relations
between themselves and capitalists. As such, surplus repression does little to
aid the worker and everything to aid the capitalist to increase their profits. Invoking
Friedrich Schiller, for Marcuse the solution was to rehabilitate art, which
would allow “a total revolution in the mode of perception and feeling”
(Schiller, quoted in Marcuse 1966, 189). As the task in this article is not to appraise his solution, I will leave that kind of extended assessment for another day. Suffice it to say that intersubjective
harmony is necessary, as is the reconciliation of sense and reason, if the revolutionary
path to human fulfilment is to be achieved. One step on that journey requires
overcoming the reifications created by capitalist societies.
Having outlined how the quantification
within econometric reasoning is a reification that sets the stage for anti-dialectical thought, it is worthwhile viewing econometrics as a recent incarnation of the long tradition of positivism. Positivism, for Adorno, is a standpoint with
“categories as simply given” that are generally subsumed by class relations.
This kind of subjectivism, as Habermas demonstrates in his essay ‘The
Analytical Theory of Science and Dialectics’, is but one standpoint seeking to
exclude whole areas of human knowledge that cannot be known through formal
methodological rules (1977, 137). In accounting for the development of this
complex social phenomenon that posits a rational ‘objective’ mode of
understanding, Marcuse writes that “Positivism shifts the source of certainty
from the subject of thought to the subject of perception. Scientific
observation yields certainty here” (1955, 351). All
told, positivism is founded on a specific conceptual set of ontological and
epistemological stances which presumes that subjects can stand adjacent to
ontology and epistemology and that conceptual elements are neutral rather than
neutralising their constitutive objects. This would certainly be a fair assessment of
the kind of political economy practiced by Acemoglu and Robinson.
To the extent that one can do justice to the
topic in an essay, it is worth contrasting positivism with Adorno’s conception
of dialectics, drawing primarily from Negative Dialectics. Set in
opposition to German idealism, whether Kantian or Hegelian, Adorno’s
materialism proposes that efforts to separate the subject and object are deeply
misguided: even more so when seeking to give priority to the subject. This is
because the subject is itself an object constituted by society more broadly,
that it could not exist without society. The task that Adorno sets himself,
then, is to break the prevailing deceptive fallacy of “constitutive
subjectivity” and instead promote “reconcilement” (2004,
xx; 6). One part of this larger task involves
reopening issues of metaphysics in philosophy; its counterpart is to undertake
an offensive against positivism.
This returns us to the important differences
between Lukács’ and Adorno’s respective stances on the conceptualisation of
knowledge more broadly. As Susan Buck-Morss notes, for Lukács, alienation was a
result of reification stemming from bourgeois society, a society set on destroying culture by making artists unable to create a unity
between subject and object. Accordingly, Lukács put considerable stock in the
proletariat to create this unity as history unfolded and this class became the
agent for restoring a lost totality. Adorno vehemently disagreed. His
conception was that knowledge of history was also historical. This gave rise to
his adage, “History is in the truth; the truth is not in history” (Adorno, quoted
in Buck-Morss 1977, 46). Indeed, as he indicates,
“dialectics [is] not a standpoint” (Adorno 2004, 4).
At the risk of broad generalisation, in the
18th and 19th centuries, political economy was predominantly a verbal science,
its subsequent Marxian critique very much marginalised from the academy. By the
late 20th and early 21st centuries it had become mathematized, with bounded formal
modelling of financial transactions, decisions within firms, and national
economies becoming standard practice.
Granted, there are many varieties of political economy being practiced today,
ranging from constitutionalism, social choice and public economics, to
macroeconomics, historical developmental, and international political economy
(see Weingast and Wittman 2006);
nevertheless, complex statistical modelling is central to effective governance,
a vital component of technical administration and control. This holds even in
democratic governance. This development has given rise to a technocratic elite
with its own languages of expression and ways of reasoning that form an
epistemic genre. This connection is made via the application of calculability, using
mathematics to present what appears to be a formal logic. Yet the excision of
Marxian critiques has been very much to the detriment of making political
economy a critical social science.
Econometrics is but one of the more recent examples in the history of
quantification practices. Herein social affairs are treated as objects ripe for
impartial – and thus authoritative – technical manipulation, thereby mystifying
the social realm. The shorthand expression of this is to say that econometrics
is a positivist rendering of the social structure, seeking to rearrange the
material world in its own image by pursuing a mathematical characterisation of
social life. It is, as I have suggested, a depoliticisation of the social
question rendered through the dominance of anti-dialectical thought. This
weakness is papered over by mathematical sophistication, institutional clout,
and ideology, all on display in the reception of Acemoglu and Robinson’s
analysis and method.
To be clear, this is not to say that these numbers cannot at times be
useful or have practical utility. As the progress towards the Millennium
Development Goals illustrates, technical operations using quantities can help
to promote human flourishing. Rather it is to say that numbers can function as
a form of class knowledge, which in turn shapes reality. In late capitalism the
reasonable bounds of quantification have been unreasonably extended to all
areas of human life, seeking to capture and reduce senses and experiences.
Quantification, with its aura of objectivity and neutrality, is just the most
recent incarnation of an influential intellectual lineage within modernity seeking
to construct and administer objects in a technical manner. Motivating this
extension is the spectre of positivism, so naturalised that it is almost
unperceivable, but still very much present.
In this article I have argued that the ritualistic quantification and mathematical modelling of econometrics rest upon an anti-dialectical standpoint, and that the goods it provides are therefore suspect. Because econometrics is a mode of analysis severed from questions about the origins of its production, the reasons for its circulation, and its class character, it is important to pay attention to the kinds of objects it produces, for there are sociological consequences to a social world structured by this symbolic mediatisation. For while econometrics appears to demonstrate the objective authority of data, the skilful manipulation of the latter demonstrates an expertise that allows one to control the administration of political subjects. This technical operation works through a framework in which there is little prospect of reconcilement, and so does not fully permit a discussion about human values. I am hopeful that this will change, but
change has to tackle the fetish in the wider computational turn currently under
way in the social sciences, a turn where modelling and quantification comes at
the expense of studying the history of social processes. The primary task
ahead, as I see it, is to find the opportunity and means to insert dialectical
thought into the wider discussion about data analysis for social justice, or to
assess if this task is even possible.
Acemoglu, Daron and James Robinson. 2012b. How Marx Got it Wrong [blog post]. Accessed 27 March, 2020. http://whynationsfail.com/blog/2012/5/11/how-marx-got-it-wrong.html
Adorno, Theodor. 2004. Negative Dialectics. Translated by E. B. Ashton. London: Routledge.
Adorno, Theodor. 1977. Introduction. In The Positivist Dispute in German Sociology, by Theodor W. Adorno, Hans Albert, Ralf Dahrendorf, Jürgen Habermas, Harald Pilot and Karl R. Popper, 1-67. Translated by Glyn Adey and David Frisby. London: Heinemann.
Anderson, Perry. 1974. Lineages of the Absolutist State. London: Verso.
Buck-Morss, Susan. 1977. The Origin of Negative Dialectics. London: The Free Press.
Collins, Chuck and Josh Hoxie. 2017. Billionaire Bonanza: The Forbes 400 and the Rest of Us. Report for the Institute for Policy Studies, November 2017. Accessed 27 March, 2020. https://ips-dc.org/wp-content/uploads/2017/11/BILLIONAIRE-BONANZA-2017-FinalV.pdf
Davies, William. 2017. How statistics lost their power – and why we should fear what comes next. The Guardian, 19 January. Accessed 27 March, 2020. https://www.theguardian.com/politics/2017/jan/19/crisis-of-statistics-big-data-democracy
Fuchs, Christian. 2016. Critical Theory of Communication. London: University of Westminster Press.
Habermas, Jürgen. 1977. The Analytical Theory of Science and Dialectics. In The Positivist Dispute in German Sociology, by Theodor W. Adorno, Hans Albert, Ralf Dahrendorf, Jürgen Habermas, Harald Pilot and Karl R. Popper, 131-162. Translated by Glyn Adey and David Frisby. London: Heinemann.
Jerven, Morten. 2016. Review of Cooley, Alexander and Jack Snyder, eds., Ranking the World: Grading States as a Tool of Global Governance. H-Diplo, H-Net Reviews. http://www.h-net.org/reviews/showrev.php?id=45200
Jerven, Morten. 2015. Africa: Why Economists Get It Wrong. London: Zed Books.
Lawrence, Peter. 2010. Development by Numbers. New Left Review 62, March-April: n.p.
Lukács, Georg. 1971. History and Class Consciousness. Cambridge, MA: The MIT Press.
Mann, Geoff. 2017. In The Long Run We Are All Dead. London: Verso.
Marcuse, Herbert. 2002. One Dimensional Man. New York: Routledge.
Marcuse, Herbert. 1966. Eros and Civilization. Boston: Beacon Press.
Marcuse, Herbert. 1955. Reason and Revolution. London: Routledge & Kegan Paul.
Morgan, Mary. 2012. The World in the Model. Cambridge: Cambridge University Press.
Morgan, Mary. 1996. The History of Econometric Ideas. Cambridge: Cambridge University Press.
Pasquale, Frank. 2015. The Black Box Society. Cambridge, MA: Harvard University Press.
Piketty, Thomas. 2017. Why Save the Bankers? Translated by Seth Ackerman. Boston: Mariner Books.
Rodney, Walter. 1981. How Europe Underdeveloped Africa. Washington: Howard University Press.
Romer, Paul. 2015b. My Paper “Mathiness in the Theory of Economic Growth”. https://paulromer.net/mathiness/
Scott, James, C. 1998. Seeing Like a State. New Haven: Yale University Press.
Thompson, John. 1995. Media and Modernity. Cambridge: Polity Press.
Timcke, Scott. 2017. Capital State Empire. London: University of Westminster Press.
Wolff, Edward. 2017. A Century of Wealth in America. Cambridge, MA: Harvard University Press.
Scott Timcke
Scott Timcke is a comparative historical
sociologist interested in the study of race, class, and inequality. His
approach to these topics is greatly shaped by South African and Caribbean
critiques of the Anglo-American liberal tradition.
[1]
To give some indication of
their clout, Acemoglu and Robinson’s work has been cited 150,221 and 77,139 times respectively, although there is
some overlap because they publish together. See their Google Scholar profiles
at https://scholar.google.com/citations?user=l9Or8EMAAAAJ&hl=en and https://scholar.google.com/citations?user=rNHDppMAAAAJ&hl=en. Based upon the RePEc
bibliometrics, Acemoglu is the most cited economist of the past decade.
Robinson is in 101st place, which is also impressive given that he identifies his field as political science and publishes in it. See https://ideas.repec.org/top/top.person.all10.html.
[2]
Marcuse’s analysis rejects the
Freudian conception of the necessity of the Reality Principle’s trumping the
Pleasure Principle, indeed favouring the normative conception of de-alienated
labour sketched by Marx. More generally, for Frankfurt
School theorists, the universality of Freud’s pessimism is a conservative foreclosure of the very possibility of revolutionary action; nor is it attuned to ‘the whole man’.