There's been a good bit of conversation recently about the merits and demerits of "public philosophy," and, as someone who considers herself committed to public philosophy (whatever that is), I'm always happy to stumble across a piece of remarkably insightful philosophical work in the public realm. Case in point: Robin James (Philosophy, UNC-Charlotte) posted a really fascinating and original short essay on the Cyborgology blog a couple of days ago entitled "An attempt at a precise & substantive definition of 'neoliberalism,' plus some thoughts on algorithms." There, she primarily aims to distinguish the sense in which we use the term "neoliberalism" to indicate an ideology from its use as a historical indicator, and she does so by employing some extremely helpful insights about algorithms, data analysis, the mathematics of music, harmony, and how we understand consonance and dissonance. I'm deeply sympathetic with James' underlying motivation for this piece, namely, her concern that our use of the term "neoliberalism" (or its corresponding descriptor "neoliberal") has become so ubiquitous that it is in danger of being evacuated of "precise and substantive" meaning altogether. I'm sympathetic, first, as a philosopher, for whom precise and substantive definitions are as essential as hammers and nails are to a carpenter. But secondly, and perhaps more importantly, I'm sympathetic with James' effort because, as Jacques Derrida once said, "the more confused the concept, the more it lends itself to opportunistic appropriation." Especially in the last decade or so, "neoliberalism" is perhaps the term par excellence that has been opportunistically appropriated, by both the Left and the Right.

James' definition of neoliberalism's ideological position ("everything in the universe works like a deregulated, competitive, financialized, capitalist market") ends up relying heavily on her distinction of neoliberalism as a particular type of ideology, i.e., one "in which epistemology and ontology collapse into one another, an epistemontology." In sum, James conjectures that neoliberal epistemontology purports to know what it knows (objects, beings, states of affairs, persons, the world) vis-a-vis "the general field of reference of economic analysis."


Neoliberal epistemontology presumes, first, that everything that can be known is known best on the model of market-logic. But, as James (and Foucault) note, the coherency of that epistemological presumption depends upon, includes and often veils another, almost indistinguishable, ontological presumption: namely, that everything that is is a market. Of course, such "epistemontologies" end up being massive feedback loops, philosophically speaking; they produce and reproduce the very phenomena that they claim to be simply discovering and analyzing. (Fwiw, I think there are some strains of psychoanalytic theory that suffer the same fate.) The fact that neoliberalism, as an ideology, is first and foremost an epistemontology of this sort provides James a way of explaining why most of the historical manifestations of it (she includes "big data, post-identity politics, globalization, creative destruction, resilience, sustainability, privatization, biopolitics, relational aesthetics") are consistently understood through algorithms. Neoliberals, believing that all that is is a market, are ever in search of better predictive mathematical formulas for understanding how the agents of that market will freely and rationally determine their interests and direct the market… and/yet/but, by virtue of those same algorithmic analyses, neoliberals also end up manipulating the market, its agents, and whatever remains of what we take to be the "freedom" involved in "free choice."

One of the most interesting parts of James' essay, to my mind, is her (excellent, but all too brief!) explanation of the prominence neoliberal ideology affords to algorithms. Neoliberal economic analyses, to quote James, "find the signal in the noise" of phenomena and human behavior by combining two sets of ideological commitments: (1) a commitment to particular epistemontological presumptions (the world/reality is a market, agents in the world are intentionally rational, agents' behaviors are systematic/non-random/predictable and, thus, can be known/understood) and (2) a commitment to algorithmic analysis, constant mathematical modeling, which is itself necessitated by the presumptions of (1). James teases out the implications with a musical metaphor: one way to understand harmony is as a consequence of "phase convergence" (when wave forms with different frequencies fall into sync); if we understand individuals as distinct wave forms with different frequencies, as neoliberalism does, then we can allow for the possibility of "social harmony" without needing to collapse the distinct wave forms into one another or erase their difference in frequency. Metaphorically, neoliberalism can understand social harmony as something that "naturally" occurs in phases– asynchronous things will, over time, fall in and out of sync with each other– without sacrificing neoliberalism's commitment to the idiosyncratic, free, rational intentionality of individual agents. Thus, "achieving" social harmony, if that is a legitimate project at all, ought not be a project of regulating individuals so that they operate more in sync with one another, but rather one of staying out of their way. (Don't tread on me!)
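For readers who want to see the "phase convergence" idea concretely, here is a minimal numerical sketch (mine, not James'; the frequencies and names are purely illustrative): two uncoupled oscillators at slightly different frequencies drift in and out of phase entirely on their own, with no regulation bringing them into sync.

```python
import math

# Two "agents" modeled as wave forms with slightly different
# frequencies (in Hz). The values are purely illustrative.
f1, f2 = 440.0, 442.0

def phase_alignment(t):
    """Cosine of the phase difference between the two waves at time t:
    +1.0 means fully in sync (consonance), -1.0 fully out of sync
    (dissonance)."""
    return math.cos(2 * math.pi * (f1 - f2) * t)

# With a 2 Hz difference, the pair drifts out of sync and back
# every half second, with no coupling or "regulation" at all.
for t in (0.0, 0.25, 0.5):
    print(t, round(phase_alignment(t), 3))
```

This is just the familiar "beat" phenomenon from the physics of sound: consonance and dissonance alternate periodically as a natural consequence of the frequency difference itself, which is exactly the sense in which the metaphor lets "social harmony" occur without anyone synchronizing the parts.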
Of course, the great irony evident in neoliberals' ubiquitous efforts at data-collection– their constant, relentless and mostly covert encroachment into our "private" lives– is that such efforts are justified on the basis of safeguarding our individual freedom to engage in the market according to our own interests, as those interests are freely determined by us.

Never mind that what an uncritical surrender to algorithmic analyses actually does– little by little, Google search by Google search, Facebook like by Facebook like, Amazon purchase by Amazon purchase– is eventually come to determine not only our interests, but also our "freely, intentionally rational" selections among them.

To the extent that there's anything really missing in James' argument– and, to be fair, hers is a very short piece that does not pretend to offer a full analysis– I think it's an under-emphasis on another presumption of neoliberal epistemontology: the market (which we ought remember, for neoliberals, is all that is the case) is ever guided by an Invisible Hand.  James' focus on algorithms and mathematical modeling is immensely valuable for understanding many of the epistemological commitments and strategies of neoliberal epistemontology, but I'd just like to unpack the implications of the ontological (or, really, onto-theological) commitments of the neoliberal "reality-as-market" worldview briefly here.

Perhaps the single most important proposition in modern capitalist economic theory, inherited from Adam Smith, is that competitive markets do a good job of allocating resources, that such markets channel individuals' self-interest toward the collective good as if directed by an "invisible hand."  (I won't detail the manner in which such a proposition qualifies as "onto-theological" here, partly because there simply isn't room to do so, but mostly because I think it is self-evident.) There is, of course, a long and varied history of philosophical and/or religious commitments to the world-as-purposive or the world-as-Good or the world-as-intelligently-designed.  Despite their differences in detail, and despite their sometimes outright antagonisms, what they share in common is a certain, fundamentally ontological, inflection that posits all that is the case as aiming-to-be or destined-to-be orderly, rational, if not also just and morally good.

One of the problems with neoliberalism's particular ("invisible hand") iteration of onto-theological prejudice– and this is something that James' account of the neoliberal "algorithmic modelling" fetish made more clear to me– is that it effectively blinds itself to the manner in which it not only does, but must, conflate the Hand-that-Guides with the hand(s)-that-are-guided.  When synchronicity or harmony is absent, when dissonance is resonant, when the aleatory interrupts or real human freedom (s'il y en a) insists– that is to say, when the Invisible Hand is not only non-apparent but also non-existent— neoliberalism's epistemonto(theo)logical commitments force neoliberals to, quite literally, phish or cut bait.  And what is phishing, after all, but the manufacturing of an Invisible Hand?

What are drones, for that matter?

So, perhaps (but not really) pace James, I'm not convinced that neoliberalism is as passive with regard to "social harmony" as her analysis might suggest.  (For the record, I don't think she meant to suggest that neoliberalism is passive and I'm confident that she doesn't think that.)  Neoliberals aren't simply playing around with predictive algorithms and waiting for a harmonic or synchronous phase convergence–that is NOT James' thesis, for the record– but rather, I suspect, neoliberals' epistemontological commitments put them squarely in the seat of the remote-operator of a drone we might call "Invisible Hand."  And, not to put too fine a point on it, but the "Invisible Hand" drone is a deadly effective weapon that basically works like this: defund or deregulate, make sure things don't work, wait for people to get angry, then privatize.  That's the formula Noam Chomsky detailed in his brilliant essay "The State-Corporate Complex: A Threat To Freedom and Survival", in which he also sagely reminded us that the only occurrence of the phrase "invisible hand" in Adam Smith's Wealth of Nations appears in a passage that critiques what we now call neoliberalism.

To wit, all this has inclined me to think that the customary use of the Gadsden flag ("Don't tread on me!") to represent neoliberalism is perhaps not as appropriate as opting instead for the Franklin woodcut ("Join, or Die") that I used at the top of this post.  "Join, or Die" seems to be far more indicative of the neoliberal imperative, shouted into the panopticon of our modern world and echoed off every wall by banks, political parties, corporations, families, nation-states, social groups and social media. I think it's consistent with James' Foucaultian-inspired insights to say that the post-9/11 neoliberal project determines even more than what Foucault conjectured contemporary notions of nation-state "sovereignty" determine.  The sovereign nation-state determined "[who] to make live and [who] to let die," but neoliberal entities– hardly ever nation-states anymore– determine who to make live and who to make die.  Because "living" is utterly unrecognizable except as an algorithmic variable by big neoliberal data, there is no "living" that is not "joining."

And there is no not-joining without dying.

[Originally posted at ReadMoreWriteMoreThinkMoreBeMore]


9 responses to “Join, or Die: Neoliberalism, Epistemontology, Social Harmony and the (Invisible) Invisible Hand”

  1. Gordon

    Hi Leigh, thanks for the post – this is really productive. I also think I disagree on one of the threads running through your (and Robin’s) analysis: the epistemontology. I realize that to be persuasive on the point, I’ll need to say more than I can in a comment, but my sense is that neoliberalism has about as thin an ontology as one can. In particular, I don’t think there’s a commitment either to the naturalness of markets or the naturalness/inevitability of individuals behaving as dictated by economic rationality. I take it that the break with Adam Smith-style liberalism (and I’m getting this from Foucault) is precisely the refusal to think that markets occur spontaneously. For the second, there’s quite a lot of effort being expended to try to cajole (with ideological self-help books), force (with digital rights management schemes, converting pension plans into 401(k)s, etc.), nudge (shopper’s cards, which are the necessary condition for getting a sale price), and so on. For people that prove sufficiently recalcitrant, we throw the entire weight of the punitive apparatus at them (I’m getting this last point from Loïc Wacquant).
    It of course matters who one is talking about – Coase is not Becker is not Hayek and so on. But my sense of Hayek, for example, is that the argument for markets is based on the efficiency of the price mechanism. But at least in a text like Road to Serfdom, he’s prepared to make the normative argument in terms of the rule of law, and he doesn’t think either is inevitable.
    There are folks who try to ontologize information – Luciano Floridi comes to mind. And so if that project were successful, you’d get to the big data endpoint by other means… but I don’t think that project is going to work out, either…
    One problem is the one Robin identified – the word neoliberalism means a lot of things to a lot of people.

  2. Robin James

    Hi Gordon–I don’t think our claims about ‘ontology’ are incompatible. What I’m trying to argue (and maybe not clearly enough yet) is that algorithmic modeling is what does the naturalizing and rationalizing. It doesn’t at all matter if individuals actually behave in any systematic way, as long as we can find a model that sufficiently predicts their behavior. So unlike Kant’s genius, who acts as if he’s NOT following a rule (i.e., as if he’s just following ‘nature’), homo economicus acts (from the perspective of those analyzing his behavior) as if he IS following a rule (even and especially because he’s not). The ‘ontology’ I’m talking about is this idea that there is a law/rule/ratio/signal in this noise, if we can just find it. And, in order to find it, we may need to, you know, engineer and process the hell out of that transmission (with self-help books, sales, etc.).
    And while it is really hard to nail down a definition bc there are so many people using the term in so many ways, I think (hope?!) we can come to some sort of consensus about ‘neoliberalism’ like we have about, say, ‘social contract theory.’ With the latter, even though we recognize that there’s a broad diversity among specific contractarians (Hobbes is pessimistic and authoritarian, Rousseau believes in the general will, Locke is who you want to look to for property in the person, and then there are minor figures like Grotius etc…), there’s still the general, intro-to-phil level sense of the social contract as “agreeing to give up some liberty in exchange for some security.”

  3. sk

    Thanks for this, Leigh and Robin (and Gordon!). I am really intrigued by this ‘epistemontology’ idea, but I have reservations – or rather, questions – of the sort proposed by Leigh and by Gordon. However, the ‘as if’ you note here, Robin – echoing the ‘as if’ Becker uses in his 1962 article, which Foucault picked up on – seems to be effecting this shift. In the Becker piece, whether individuals actually made rational (utility-maximizing) choices or irrational ones was irrelevant; at one level of abstraction away, they still came out as rational, so whatever. This ‘as if’ either fictionalizes ontology, or shifts ontology out of the individual choice maker (the one who is called to join or die) and onto/into the model, which before was meant only to have a descriptive power (i.e., an epistemological function). So there does seem to be a confusion or mixture of ontology and epistemology at work here, that occurs through a fiction, this ‘as if.’ Moreover, if the value of a social science is in its predictive power, then there is a huge incentive to make the models predictive, and to use the models to nudge, or to force, individuals to show themselves to be sufficiently responsive to the models. This incentive only gets bigger with the huge monetary rewards available to academic social scientists as consultants of various kinds.
    It looks like, much like liberal political theory has been critiqued as a displacement of politics, neoliberal theory is a displacement of ontology – from the model of the individual, to the model of the model. Or perhaps, that of the algorithm – which seems yet another step abstracted away (or perhaps an algorhythm is just a model in motion).
    Am I on the right track in fleshing out this epistemontology thingie?

  4. Gordon

    Hi Robin, I’m not sure if you and I agree or not – this could be a question of emphasis. I definitely agree w/ at least two aspects of your larger point – that it would be useful to have some sort of baseline definition of neoliberalism of the sort you cite for social contract theory. I also think that it’s substantially the move to statistical analysis that’s central. (I’m hesitating about the word ‘algorithm’ on technical grounds – I did a lot of computer programming in high school, and so I have a fairly specific notion of what an algorithm is.) I need to think more about it. (Katherine Hayles has a fantastic chapter on the need for a critical theory of code – she basically shows how grammatology works for writing but not code. It’s the first chapter in My Mother Was a Computer, and I at least want to reread that.)
    I tend to think of neoliberalism as an apparatus (dispositif) of biopower, one that takes homo economicus as the truth function of being human. I put biopolitics on top because I think the occlusion of juridical power is important. I also think biopower is about statistics as much as or more than its object in ‘life’ – and so the intensification of data analysis becomes one of the ways to distinguish old liberalism from neo. I’m not sure I want to say that rule-following is exactly what’s going on here, mainly because I’m not sure whether to characterize acting upon risk assessments as rule-following.
    That said, your comparison with Kant is really interesting. See what you think of this: suppose we say that I’m objecting to what amounts to a transcendental illusion – that there’s something metaphysical behind our representations of data, that those representations get us to the things themselves. So critique needs to identify that when it happens, in order to remind everyone that there’s nothing to be seen. (Wendy Chun suggests that there’s a paranoid logic in the demand to have ever more ways of representing things on hand: once you discover truths with a microscope that you couldn’t see without it, you pretty much have to assume that whatever you’re using to represent data is leaving something hidden. I think she’s probably right.) But of course what’s interesting about Kant here is that although the metaphysical use of Reason is bogus, it turns out to be not only legitimate but necessary in ethics, on the ‘it can’t all be meaningless’ theory of morality. I remember reading a couple of papers to the effect that analytics is a similar ideological commitment to the ultimate meaningfulness of the world and human action within it. So that’s the onto-theology view, and then there’s the politics part, where one of the main jobs is to critique the onto-theology.
    A couple of final thoughts for now. One is the way that neoliberalism deals with non-economically-rational behavior. So behavioral economics becomes an interesting point of study, since it basically arrives to explain why we can’t do what the theory says we should do. (Kant would call that anthropology, and it serves the same purpose: save the model of rationality by modeling how we don’t live up to its standards.) The other thought is that big data becomes the ultimate test of this transcendental illusion: for its advocates, it’s the ultimate proof that there is meaning in the universe, if only we could get enough data to prove it.
    So yes, we may be on exactly the same page, or at least close to it. I think sk is right that there’s an “as if” standard here in the better economists, and they’re relatively honest about their various reductions. It’s the same move that settles on Kaldor-Hicks efficiency: the transaction would be efficient if there were no transaction costs, but there are, so it’s only efficient in a hypothetical transaction-cost-free model.
    If you wrote back a one liner that said “all you’re doing is saying that metaphysical pronouncements are political,” you’d probably be right about that…

  5. Leigh M. Johnson

    Thanks Gordon, sk and Robin. This comment thread has been very helpful.
    I suppose I’d say, first, that I agree with Gordon’s claim that neoliberal ontology is “thin”… and, secondly, that I think Robin stipulated as much in her (‘as if’) elaboration above. So, I think the difference between Robin and Gordon is one of emphasis. The more important point in Robin’s original piece was to point out how epistemology and ontology are conflated in/by/for neoliberalism, and her emphasis there was really on what the epistemology looks like and how it operates in this new neoliberal “epistemontology.” It was my piece that really pushed the ontology angle and, in retrospect, I perhaps erred on the side of taking neoliberal ontology (such as it is) at face value rather than pointing out its deeply problematic reductions.
    I don’t know whether or not neoliberals really believe that the market is ontologically primary—or whether, as I put it in my essay, “the market is all that is the case”—but they certainly act, operate and analyze ‘as if’ they believed it. It may not matter much, in the end, how resolute or authentic that belief is… but this thread has inclined me to think otherwise. Gordon is right to remind us that “all metaphysical propositions are political” and it is the business of politics (and political theory, I think) to critique onto-theology. I suppose that is the sort of critique I thought I was doing in my piece but, again, I can now see how including this ‘as if’ caveat would have been of tremendous value.
    At any rate, thanks again for this discussion.
    One last thing for Gordon: I’d be interested in hearing more about your objection to the use of “algorithm.” I’m neither a mathematician, a coder nor an economist, so be gentle!

  6. Andrew

    Thank you for this discussion! I just want to pick up on the ontology question and y’all’s very smart thoughts here. Another way to approach the question is through the contrast between Foucault’s study of the “American” neoliberals and their German counterparts: the ordo-liberals. These folks, the sometimes-called Freiburg School, were more upfront about the necessity of not taking the market for granted, but rather securing the existence of the market. That is, I would argue that for the ordo-liberals, there is no ontology to the market, but it is a concretely political project (and in direct opposition to the Frankfurt School readings of materialism). Then, to Becker/Schultz/Friedman (on whom I think Foucault rightly focuses his attention as the key figures here): this is where I would say (drawing on the pieces that sk cites above) that the USian neoliberal project as described by them has two moves: 1) positing the market ontologically and politically. That is, the market must be brought into existence as the grounds of being, of human reason, and it must be done so through force (i.e., through the state’s monopoly on the use of violence). Hence, the first two domains in which human capital theory is developed are agricultural policy (Schultz) and crime policy (Becker).
    2) The second move is epistemological and political. Here, the market is taken as the arbiter of human reason (what Foucault calls a “grid of intelligibility”) to be enforced through a self-conscious move to extend the power of this rationality to all domains of human behavior (i.e., no more anthropology, no more political science, no more sociology, there is only economics, because there is only one mode of subjectivity that matters for governance as both dominance and self-formation: homo œconomicus). To that end, I think that there is an ontology to the market for neoliberals, but not in the sense that the term is normally used. It’s a fabricated ontology based on political claims about human reason and rationality. The two moves aren’t necessarily chronological or logical, but they might be distinguishable from each other. So, I think that the “as if” matters a great deal, as I argue in my piece on Becker and Foucauldian ethics–http://rauli.cbs.dk/index.php/foucault-studies/article/view/3338–because it’s the final move of disavowal, where the bad faith shows itself, and the ethical content is ultimately evacuated from neoliberal modes of governmentality.
    So, this is why I would concur with the “all metaphysical pronouncements are political” line, and why I would say (again, following Foucault and because I am a political theorist, after all), we are always already in need of some power in our analysis of ontology and epistemology. If this is what we’re getting out by talking about how epistemontology works/functions/acts/performs, then I’m on board.
    y’all rock.

  7. Gordon

    Hi Andrew, that’s really helpful, and I (at least) don’t pay enough attention to the ordo- to neo-liberal transition. My insistence on the artificiality of markets for neoliberal self-conceptions may be an artifact of my own reading in intellectual property theory. There – and I’m reading Chicago law and not Chicago economics, mostly – the central question is one of remedying a market failure, since the marginal cost to produce an immaterial good is nearly zero. So the neoliberal accounting of intellectual property always begins with the need to use artificial monopolies to create a market, thereby providing “incentives” to produce such goods.
    Another question is about differences within the neoliberal camp. Hence, for example, there was an early debate on IP between Kenneth Arrow (who thought the way to get IP wasn’t property regimes, because of deadweight loss) and Demsetz (who said the efficiency gain from the price mechanism was more significant). There’s also what I’d call a left neoliberal line of IP theory, too, which says that too much IP is inefficient (a lot of it centers around transaction costs, and makes variations of the “tragedy of the anticommons” argument or of the Calabresi/Melamed line on liability rules). There’s also a gap between the academics and the policymakers: with only a little prodding, almost all the neoliberal economists signed a brief opposing the most recent round of copyright extension. I don’t know what to do with this level of detail yet…
    I think it matters what “they” really believe, ontologically, though, because it’s easier to convince somebody that markets need tweaking if they don’t think they’re a part of nature. So the sort of immanent critique present in the anticommons argument, for example, is a lot easier to mount if one insists that an IP market is artificial. So, too, is it (more) possible to be persuaded of the viability of other commons management strategies.
    Leigh: the hesitation about algorithms may show my age more than anything – all the programming I did was before object-oriented languages were really popular, and so my model for an algorithm involves a fairly formal logical structure, and is instruction-based (put data here, if data is x, do y, etc). So the idea that the machine is going to generate its own search queries (etc) is one that I intuitively want to assign a different concept to. As I said, I need to think a lot more about the subject, however.

  8. Andrew

    Gordon – re: the ordo-liberals, I think the best work on them has been done by Thomas Biebricher (there was a really good piece of his on ordo-liberalism in foucault studies a few years ago).
    To be clear, I think you’re not at all wrong to insist on the artificiality of markets, and the deep connections between Chicago law and econ can’t be overstated (Posner, in particular, makes the connection tight), as a whole generation of folks was trained there between the law school and the econ department. In particular, the Law and Economics workshop (the workshop being the staple of all academic life at UChicago) was and continues to be a very big deal.
    I’m really interested in the IP lines here, especially where it intersects with my pet chicago-school interest: the economics of crime literature. So, piracy seems to be the place where these things might link up, and in Posner’s now infamous line that crime is nothing more than a market failure, a circumvention of the pricing regime that generates negative externalities.
