This talk was prepared for the event “The Future is Now” hosted by Espace Ladda in Antwerp, Belgium in October 2010 and later reworked into a text.
When exactly does something become “meaningful”? Running slightly ahead of myself, I want to propose that it happens instantly and that, for something to continue producing meaning, it should either constantly include the periphery, rewire itself on a regular basis, or form fractal structures aligning with other “meaningful” formations.
This is a randomly generated network where the probability of any two of the 26 dispersed nodes being connected is p = 0. We can also call it “foam” or “connected isolations”. The nodes might have a meaning in themselves, but as a whole the structure does not appear to be meaningful yet.
Derrida mentions a “haunted city”, noting that “the relief and design of structures appears more clearly when content, which is the living energy of meaning, is neutralized.”
Let’s start to randomly walk through the city, from one node to another (from one word to another, from one person to another, etc.) in an attempt to find meaning. We could also set a specific goal or a path, but then we would be biased by the meaning of the word “meaning”. So we try the worst-case scenario: random wandering through the landscape, like Tarkovsky’s Stalker.
“In the beginning God created the heavens and the earth”. 
This is the first step: we see the formation of oppositions and dichotomies. In terms of graph theory, we are witnessing a random graph in which the probability of any two random nodes being connected is p = 0.01.
Since we will refer to probability again: p equals the current number of links divided by the total number of possible links. The total number of possible links in a network with n nodes is n(n – 1)/2 = 26 · 25 / 2 = 325. This is a formula from combinatorics, and it is the only formula I will use here. If p = 0.01, the current number of links is approximately 0.01 · 325 ≈ 3.
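This arithmetic can be checked in a few lines (a minimal sketch; the node count n = 26 and the helper name `links_at` are mine):

```python
n = 26
total_possible = n * (n - 1) // 2   # all possible links: 26 * 25 / 2 = 325

def links_at(p):
    """Approximate number of links present at connection probability p."""
    return round(p * total_possible)

print(total_possible)    # 325
print(links_at(0.01))    # 3
```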
At p = 0.05 (16 random steps) we have more complex triangular and sequential motifs forming within the network:
We could interpret them as emergent narrative structures (blue formations) and feedback loops (orange formation), but then we would be trying to make sense. In any case, the network does not yet present a meaningful formation as a whole; instead we are dealing with dispersed islands. However, there are differences within these islands: some nodes have more connections than others. We make the more connected ones look more powerful.
As we move on through our “haunted city” and make more random steps, the network undergoes a phase transition in which most of the nodes become connected within one single structure. In terms of network theory, a so-called “giant component” appears. According to Erdős and Rényi, this sudden transition from disjointed motifs to the giant component in a random network happens when the average number of connections per node reaches one, i.e. when the number of links reaches half the number of nodes. In our case that would be at the point p = 13 / 325 = 0.04. When p = 0.10 (32 successive iterations in our case) this kind of formation emerges:
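The growth of the largest component during such a random walk can be simulated with a union–find structure (a sketch in pure stdlib Python; the fixed seed and the function name `largest_component_sizes` are mine):

```python
import random
from itertools import combinations

def largest_component_sizes(n, seed=0):
    """Add all n*(n-1)/2 possible edges in random order and record the
    size of the largest connected component after each edge (union-find)."""
    parent = list(range(n))
    size = [1] * n

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    edges = list(combinations(range(n), 2))
    random.Random(seed).shuffle(edges)
    sizes, biggest = [], 1
    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            if size[ra] < size[rb]:
                ra, rb = rb, ra
            parent[rb] = ra               # merge the smaller component in
            size[ra] += size[rb]
            biggest = max(biggest, size[ra])
        sizes.append(biggest)
    return sizes

sizes = largest_component_sizes(26)
# sizes[12] is the largest component after 13 links (p ~ 0.04);
# sizes[-1] == 26 once every possible link has been added
```

Plotting `sizes` against the step number makes the sudden jump of the giant component visible.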
Following the perceptual organizing principles of gestalt, as soon as most of the nodes belong to the same component, we see it as a simplified whole rather than as disparate parts. The network communicates meaning as a whole; had we not walked randomly, this could have happened much earlier.
In order to make more sense I could also say that if you do something long enough it will finally appear to be meaningful. Like when you read a long novel. Or when you constantly meet people and suddenly find yourself in a community where everyone knows each other. Or when you realize that things always happen for a reason as you get older.
That is an interesting point: what happens when something already has a meaning, but we continue to search for it? According to Erdős and Rényi, the threshold probability at which subgraphs of 4 fully interconnected nodes emerge within a random network is p ~ N^(–2/3) ≈ 0.11. Therefore at p = 0.15 (or 48 random iterations) there is a high probability that the network contains subgraphs (or motifs) of 4 fully interconnected nodes. These motifs indicate the emergence of an informational network.
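Whether a given set of links contains k fully interconnected nodes can be checked by brute force over all k-node groups (a sketch that is fine for a 26-node network; the name `has_clique` is mine):

```python
from itertools import combinations

def has_clique(edges, nodes, k):
    """True if some k nodes are pairwise connected (a complete subgraph K_k)."""
    edgeset = {frozenset(e) for e in edges}
    return any(
        all(frozenset(pair) in edgeset for pair in combinations(group, 2))
        for group in combinations(nodes, k)
    )

# four mutually connected nodes form the K4 motif from the text:
k4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(has_clique(k4, range(4), 4))    # True
# a 4-cycle contains no triangle, let alone a K4:
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(has_clique(ring, range(4), 3))  # False
```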
At p = 0.25 (or 82 iterations) the network contains subgraphs of 5 fully interconnected nodes; the complexity increases, the clusters of nodes become more interconnected, and it becomes harder to distinguish one “community” from another.
We could also think of our random walk as shuffling a deck of cards where each time we take the card from the top of the deck and put it back in a random position. Aldous and Diaconis (the latter a professional magician turned mathematician) showed that this way one can reach the point where each card within the deck has an equal chance to appear at the top. This is called the discrete uniform distribution, and in order to reach it we need to perform t = n log n iterations, which is t = 26 · ln 26 ≈ 85 steps in our random walk. We could say that starting from this point every next step has an equal probability, producing the “difference without a concept, repetition which escapes indefinitely continued conceptual difference.”
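The top-to-random shuffle and the n log n bound look like this in code (a sketch; I assume the natural logarithm, since 26 · ln 26 ≈ 85 matches the number in the text):

```python
import math
import random

n = 26
t = round(n * math.log(n))   # n log n ~ 85 steps to approach uniformity

deck = list(range(n))
rng = random.Random(1)
for _ in range(t):
    card = deck.pop(0)                   # take the top card...
    deck.insert(rng.randrange(n), card)  # ...put it back at a random position

print(t)  # 85
```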
Indeed, when the probability of any two randomly chosen nodes being connected is p = 0.5 (which happens after 163 random iterations), all the nodes in the network are more or less equally interconnected and there are hardly any clusters: every node reaches every other node very quickly, the entropy increases, and the differences subside.
It makes sense: we visited all the sights in our “haunted city” so many times that they all look more and more the same. The city has a meaning as a whole, but everything and everyone inside is almost equal, we might even start to feel a bit bored at this point.
If we continued connecting the dots, we would reach the point (p = 1) where the system reaches equilibrium: every node is connected to every other, the power is equally distributed, and the structure solidifies to the point where each consecutive step does not produce any more difference. Information is measured by the amount of entropy it decreases. At this point every new step we make will neither decrease entropy nor produce meaning.
The structure starts to produce its own meaning, acting as an amalgamated whole where the differences between the nodes do not exist anymore. In order to evolve and continue producing meaning it has several choices:
1. Start a reverse process and remove already existing connections (or some of the nodes) in order to reintroduce difference into the network. This brings us to a predator-prey model, one of the main mechanisms for maintaining non-equilibrium stability employed in nature.
2. Integrate nodes from the periphery, in order to tip the equilibrium point and continue the evolution;
3. Treat the resulting network as a node in itself and start operating on a meta-level, building new connections with other node-networks, creating a fractal-like structure.
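The predator-prey mechanism from option 1 is classically captured by the Lotka–Volterra equations; here is a minimal forward-Euler sketch (the coefficients are illustrative, not taken from Bazykin):

```python
# dx/dt = a*x - b*x*y   (prey: grows, gets eaten)
# dy/dt = d*x*y - c*y   (predator: eats, dies off)
a, b, c, d = 1.0, 0.1, 1.5, 0.075
x, y = 10.0, 5.0           # initial populations
dt, steps = 0.001, 10_000  # integrate 10 time units

for _ in range(steps):
    x, y = x + dt * (a * x - b * x * y), y + dt * (d * x * y - c * y)

# the two populations keep oscillating around a non-equilibrium cycle
# instead of settling into a frozen, fully "solidified" state
```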
Meaning itself has to do with a complex web of relations presenting a certain interconnected structure that we can perceive, understand, or recognize. That’s why context is so important. However, it is not always enough to see a structure in order to make sense out of it. Meaning is beautiful especially for the reason it doesn’t always make sense. Meaning has to do with patterns, order, and structure. It’s almost in juxtaposition to the natural tendency of time towards entropy, disorder, decay, and death. Meaning, then, could be a sign of something alive.
Peter Sloterdijk, Sphären III – Schäume (Suhrkamp, 2004)
Jacques Derrida, Writing and Difference (Routledge, 1978)
Andrey Tarkovsky, Stalker (Mosfilm, 1979)
Genesis 1:1
Solomonoff and Rapoport, Connectivity of Random Nets (Bulletin of Mathematical Biophysics, 13, 1951)
Erdős and Rényi, On the Evolution of Random Graphs (Publications of the Mathematical Institute of the Hungarian Academy of Sciences, 5, 1960)
Newman, Barabási, Watts, The Structure and Dynamics of Networks (Princeton University Press, 2006)
Max Wertheimer, Gestalt Theory (New York: Gestalt Journal Press, 1997)
Milo et al., Network Motifs: Simple Building Blocks of Complex Networks (Science, 298, 2002)
Aldous and Diaconis, Shuffling Cards and Stopping Times (The American Mathematical Monthly, 93, 1986)
Gilles Deleuze, Difference and Repetition (New York: Continuum, 2009)
Seth Lloyd, Use of Mutual Information to Decrease Entropy (Physical Review A, 39-10, 1989)
Alexander D. Bazykin, Nonlinear Dynamics of Interacting Populations (World Scientific Publishing, 1998)
Last image by Justin Palermo, “Covering is Revealing”, 2009.