In contrast to Heidegger, the understanding of language that Chomsky details in his seminal book “Syntactic Structures” (2002) shows that his idea of good theory and research is structured by the essentially aesthetic idea of “the simplicity of the whole system” (Chomsky 2002, 56). This principle of the simplicity of a whole should accordingly organize the search for a “metalanguage to the language in which grammars are written – a metametalanguage to any language for which a grammar is constructed” (Chomsky 2002, 54). Obviously, the idea of simplicity acts here as a counterforce to the looming threat of infinite regress that the notion of metametalanguages invites. However, it also shows that the implicit idea of determination prominent in Chomsky’s thinking is one of a closed system, ruled by an internally absolute principle. This idea of a consistent system organized by a small set of internal rules is not surprising and, within bounds, legitimate; especially with regard to machines it is the most viable approach. Chomsky accordingly pictures the goal of a theory of language in terms of a machine that must provide a “practical and mechanical method for actually constructing the grammar” (Chomsky 2002, 50–51). This machine, as we can gather from the commentary by Chomsky, Roberts, and Watumull, is “the innate, genetically installed “operating system” that endows humans with the capacity to generate complex sentences and long trains of thought” (Chomsky, Roberts, and Watumull 2023). Now, for normal machines, like any computer one can pick up, this idea of a system works splendidly, within some limitations: in some instances it turned out to be necessary to find ways for computers to work around data and inputs which do not conform to this ideal; paraconsistent logic and fuzzy logic come to mind.
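The two workarounds just mentioned can be made concrete with a minimal sketch (our own toy illustration, not drawn from any of the cited works). In Priest’s paraconsistent logic LP, a third truth value marks sentences that are both true and false, and a contradiction no longer entails everything; fuzzy logic instead grades truth continuously, so inputs that do not cleanly fit a predicate still receive a usable value:

```python
# Toy three-valued semantics in the style of Priest's LP:
# 1 = true, 0.5 = both true and false, 0 = false.
# A value is "designated" (acceptable) if it is at least 0.5.

def neg(a): return 1 - a
def conj(a, b): return min(a, b)
def designated(a): return a >= 0.5

# Explosion ("from a contradiction, anything follows") fails here:
a = 0.5                      # a sentence that is both true and false
b = 0.0                      # an unrelated false sentence
premise = conj(a, neg(a))    # the contradiction A-and-not-A
print(designated(premise))   # True  -> the contradiction is accepted
print(designated(b))         # False -> yet B does not follow from it

# The same connectives read as fuzzy logic grade partial truth:
tallness = 0.7               # "this person is tall" to degree 0.7
print(round(conj(tallness, neg(tallness)), 2))  # 0.3 -> a partial
# contradiction that the system tolerates rather than collapses under
```

The point of the sketch is only that both systems relinquish the demand for one fully consistent whole while remaining mechanically computable.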
A fundamentally different perspective arises when one assumes that, since machines can be constructed in a particular way, spoken language (and, more broadly, reality) must also conform to this design. While it might gain some credibility by being deemed rigorous, it ultimately represents an insidious infiltration of theology into the realm of science by assuming an axiomatic primacy of the system. It is worth noting that in his 2006 preface to "In Contradiction," Graham Priest effectively illustrates the extent to which ideas surrounding the necessity of consistency and the system continue to permeate the analytic discourse on logic (Priest 2006, XVIII). In contrast, the Continental discourse on logic is predicated upon the dismantling of this onto-theology. Prominent thinkers such as Martin Heidegger, Jacques Lacan, Alain Badiou, and more recently, Quentin Meillassoux, have critiqued the notion of a consistent reality that can be represented by a systematic, coherent, and ideally aesthetically simple theory. They argue, and we hold this position too, that this concept is nothing more than a truncated and filtered form of religious thought, or in Heidegger's terms, plain old metaphysics. It is important to recognize that while individuals may hold various religious beliefs about the world, these beliefs do not qualify as the basis of scientific methodologies, be they formal or empirical.
This fundamentally changes how language is understood, especially in a systematic context. To demarcate the problem at hand, it is necessary to show where the continental discourse on language radically differs from the one in which Chomsky’s ideas originate. In stark contrast to the philosophy of Bertrand Russell (1905), as marked in his theory of descriptions, where denotations indicating nothing are essentially false if taken as a primary occurrence, the approach that we can find in Heidegger, Lacan, and the continental discourse operates as an inversion of this idea, as Jacques-Alain Miller (2002) noted. For Lacan, Heidegger, and other central representatives of the continental discourse, only those denotations oriented toward the radically indeterminate can be considered constituents of truth. While this might seem strange to those unfamiliar with this discourse, it is not as distant from classical analytic epistemics as it might seem. Karl Popper (1935), with his concept of falsification as the only true access to the reality exterior to a theoretical system, approaches a comparable thought. However, what considerable parts of the continental discourse focused on was the interlinked problem of formal reasoning and the “impasse of formalization” (Badiou 2006, 5). Hans Blumenberg (2010) demonstrated early how deeply this problem is ingrained in classical philosophy, where systems are essentially oriented around an “absolute metaphor”: an empty denotation that holds up the theory instead of harming it.
The reasons for this are complex, but it makes Russell’s solution to denotation look like something of a mirage: words or sentences have no permanence at all in their link to the real, as Hegel already demonstrated (Hegel 1988, 76), so how can we build meaning on this link? One solution is a certain pragmatism about this link, coupled with an understanding of logic that defers the problem further: if one assumes that logic and empirical reality are essentially separate, we do not need to think about the link at all, and this pragmatic gap is where the old god of the philosophers, under new names, usually creeps back in. The early Wittgenstein’s mysticism, while seemingly moving in the direction of continental thinkers, essentially relegates this problem to the theologians. However, if such strong limitations are themselves nothing but metaphors constructed to appear rigorous and strict where theory actually gets positively weird, we approach a different problem. Slavoj Žižek formulated this as our capacity of formal thought reaching through to the baselessness of reality (the void), beyond what the theistic version of a basal reality would assume (Žižek 2012, 726). Comparably, Quentin Meillassoux (2008) marked this void or chaos, which constitutes nothing but the absence of a basic reality, as the absolute foundation of mathematical reasoning, as outlined in the introduction of this paper.
This ontological inversion of denotation leads to a different understanding of the foundational elements of language. No longer is predication the central element of meaningful language; instead, as Heidegger marks it, a pre-predicative negational element of language comes to the fore: a break or gap in the consistent structure of the sentence. To approach this gap, a more complex account of negation became necessary, because this element of a sentence or system, which links it to the void by indicating an indeterminate excess, still holds up the systematic structure of the sentence or system. What enters here are different forms of negation that expand on the classical privation and have been formalized in psychoanalysis as frustration and castration. I will briefly distinguish these relations:
Privation is the classical form of negation, first marked by Aristotle. Privation is a lack of something real, which is then marked by a symbolic object (-a). This means that privation as a form of negation already requires a formal system of order, since a purely descriptive or sensual perspective has no access to the concept of a lack. This is of central importance, since the sensual or imaginary, as Lacan calls it, relation to objects is structured by its absence of negation and therefore assumes a wholeness or gestalt of its objects. For example: the missing bike is only missing because it should be at its place. It is replaced by a negative (missing) object that can be addressed, whereas the sensual does not see the lack without symbolic support. Privation as negation always indicates the negation of something and therefore “serves to express a negated existential proposition” (Carnap 2004, 96). In this sense, logical negation allows us to set up a symbolic object (-a) which allows us to ‘see’ a lack, which marks a more complex problem than pure absence. This is accessible to computers, as they are able to mark determinate absences, and this positivizing of the lack is central in language models, as negations are still only links between positive tokens (Gubelmann and Handschuh 2022), even though models initially had some difficulties with negation (Ettinger 2020; Kassner and Schütze 2020). However, since the imaginary is deeply involved in our thinking processes, it also creates a distinct empirical negation that only comes to the fore if we assume that the imaginary is the usual starting point of our thinking. The imaginary is therefore the inclusion of the systematic as a reformulation of the classical falsum.
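The claim that negation enters language models only as links between positive tokens can be illustrated with a deliberately crude sketch (our own invention; the vectors, vocabulary, and bag-of-embeddings representation are toy assumptions, not any actual model): “not” receives an embedding like any other word, so the negated sentence is the affirmative one plus one more positive vector, never a structural inversion or a gap.

```python
import numpy as np

# Toy vocabulary of random word embeddings; "not" is just another
# positive vector, exactly like "bike" or "here".
rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=8) for w in ["the", "bike", "is", "here", "not"]}

def bag(sentence):
    # crude bag-of-embeddings sentence representation
    return sum(vocab[w] for w in sentence.split())

affirm = bag("the bike is here")
negate = bag("the bike is not here")

# The negated sentence differs from the affirmative one by exactly the
# positive embedding of "not": the lack is positivized, not marked as absence.
assert np.allclose(negate - affirm, vocab["not"])
```

Real transformer models are vastly more sophisticated, but the entry point of negation remains a positive token in this sense.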
This argument was first made explicitly by Heidegger in his commentary on Nietzsche: instead of a dissonance between the presented and the represented, which would rely on a strong link between signified and signifier as a basis of truth, he introduces the idea of a full and timeless consistency of the representation, irrespective of the object in question, as a new type of falsum (Heidegger 1996, 347).
The specific relation to this falsum as a starting point of thought is what Lacan calls “frustration”: the failure of a representational relation, i.e., something is imagined in a certain way as fitting its represented object, but the real object does not fit these imagined assumptions, which emerge from the partiality of these objects. In this sense most objects are partial objects, since they never approach the consistency that the lack of lack of the imaginary relation implies. Frustration is therefore the dissonance between the representation and the represented. However, this negativity is not a privation, as the representation does not include a specific negated element of the presented but marks the existence of an indeterminate unknown as the dissonance between representation and presentation. The negativity marked is therefore not something determinate and negated, but something unknown or unexpected. As Popper noted in his theory of falsification (Popper 1935), this frustration, despite being a failure, is related to the real object of science, which shows itself by resisting our assumptions. In epistemic terms this negation is a strong relation to the real, because while we cannot fully prove any empirical theory, we can disprove it consistently through frustration. This relation is gaining increasing importance in machine learning, and we see it in various uses across different sciences. However, its origin is, according to Lacan, not simply an expectation, but the implication of a wholeness or systematic structure.
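The Popperian asymmetry invoked here, that no finite run of confirmations verifies a universal claim while a single resisting observation disproves it, can be stated as a minimal sketch (our own toy; the swan example is the textbook one, the function names are invented):

```python
def falsified(universal_claim, observations):
    # Return the first counterexample: the point where the real
    # "resists our assumptions", i.e. frustration in the sense above.
    for obs in observations:
        if not universal_claim(obs):
            return obs
    return None  # merely unfalsified *so far*; never "verified"

all_swans_white = lambda swan: swan == "white"

print(falsified(all_swans_white, ["white", "white"]))  # None - still open
print(falsified(all_swans_white, ["white", "black"]))  # black - one case disproves
```

The asymmetry is visible in the return values: confirmation can only ever yield the provisional `None`, while a single counterexample terminates the claim definitively.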
Lastly, castration, as identified by Freud and Lacan, marks the purely formal negativity that is introduced by any determination. Any formal (symbolic) determination or proposition produces a determinate inside and an indeterminate outside, as Heidegger detailed in “What is Metaphysics?”. This can be exemplified by marking something on a blackboard. The marked space, as determined by the chalk outline, constitutes the inside, while the exteriority, though a necessary element of the determination itself, remains necessarily indeterminate. Although we can now create a bigger determinate field on the blackboard, one which includes the first determination, we again rely on an indeterminate outside. This radically indeterminate negativity only appears, however, if we give up on the absolute in Spinozean terms. Classically, the formal problem of castration could be ignored (in Newtonian physics and nineteenth-century science, for example) by introducing a more or less explicit theological concept of the absolute: a final, infinite, and fully self-determined unity called ‘God’. This reduces the necessary indeterminate to an epistemic indeterminate – something we just don’t know yet – instead of a necessary element of any formal structure. While the theological argument is weakened today, the same effect can still be reached by a variant of it: conflating the imaginary and the symbolic into an imaginarized symbolic, which operates without this excess. This has been discussed prominently by Heidegger (Heidegger 1999, 82–96) and later by Badiou (Badiou 2006), and we have already identified it as a problem in the general conceptualization of AI (Heimann and Hübener 2023a).
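The blackboard example admits a minimal formal sketch (our own toy illustration over the integers, not a claim about any actual formalism): every finite determination carves out a determinate inside, and no enlargement of that inside ever exhausts the outside it depends on.

```python
# The first chalk outline: a finite, determinate field.
inside = {1, 2, 3}

# A bigger determinate field that includes the first determination.
bigger = inside | {4, 5, 6}

def beyond(field):
    # For any finite determinate field over an unbounded domain,
    # there is always an element outside it.
    return max(field) + 1

assert beyond(inside) not in inside
assert beyond(bigger) not in bigger  # the enlarged field has its own outside
```

However large the determinate field grows, the construction of `beyond` succeeds again, which is the formal shape of the point that the outside is a necessary element of the determination itself.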
What is important here is that the structural elements of frustration and castration work together to constitute the access to the gap that, for example as an absolute metaphor in the Blumenbergian sense, structures the consistency of a system. Closer analysis of such a metaphor of course always reveals the actual inconsistency of the system, marking it as a symbolic object that represents nothing: a form of privation that marks this gap as such. This is missing in current LLMs, as negation is currently represented by positive weights (Morante and Blanco 2021). What Lacan’s approach to logic therefore includes, as Alain Badiou (2006, 5) notes, is “the clear Lacanian doctrine” that “the real is the impasse of formalization”. This empirical break in the formalism, upheld by the break itself, is what the Lacanian tradition thinks under the header of castration, and its relation to consistency, systems, and the law has been the object of extensive scrutiny (see for example Copjec 1994; Ragland-Sullivan 2015; Zupančič 2017).