Self-organizing Patterns: Local Activation, Long-range Inhibition

These automata follow a simple rule: each cell tries to take the same color as its near neighbors, and the opposite color of cells a little further away. Slight biases vary the background/foreground relations, and initial conditions determine other aspects of the transient and stable patterns (e.g., the random initial condition in Video 3 vs. the approximately symmetrical ring set as the initial condition in Video 4).


Video 1


Video 2


Video 3


Video 4
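A minimal sketch of the rule in Python; the radii, inhibition weight, grid size, and step count here are illustrative choices, not the parameters used in the videos:

```python
import numpy as np

def lali_step(grid, r_act=1, r_inh=3, w_inh=0.4):
    # One synchronous update of a local-activation, long-range-inhibition
    # automaton on a torus: each cell sums neighbor states (+1/-1) within
    # Chebyshev distance r_act (activation) and in the annulus out to r_inh
    # (inhibition), then takes the sign of the weighted difference.
    n, m = grid.shape
    new = np.empty_like(grid)
    for i in range(n):
        for j in range(m):
            act = 0.0
            inh = 0.0
            for di in range(-r_inh, r_inh + 1):
                for dj in range(-r_inh, r_inh + 1):
                    if di == 0 and dj == 0:
                        continue
                    s = grid[(i + di) % n, (j + dj) % m]
                    if max(abs(di), abs(dj)) <= r_act:
                        act += s
                    else:
                        inh += s
            new[i, j] = 1 if act - w_inh * inh > 0 else -1
    return new

rng = np.random.default_rng(0)
grid = rng.choice([-1, 1], size=(40, 40))   # random initial condition
for _ in range(10):
    grid = lali_step(grid)
```

Varying `w_inh` (the bias between activation and inhibition) shifts the background/foreground balance, and different initial grids select different stable patterns.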



Climate Models and Precautionary Measures

Forthcoming in Issues in Science and Technology, Summer 2015

Joseph Norman, Rupert Read, Yaneer Bar-Yam, and Nassim Nicholas Taleb

The policy debate with respect to anthropogenic climate change typically revolves around the accuracy of models. Those who contend that models make accurate predictions argue for specific policies to stem the foreseen damaging effects; those who doubt their accuracy cite a lack of reliable evidence of harm to warrant policy action.

These two alternatives are not exhaustive. One can sidestep the “skepticism” of those who question existing climate models by framing risk in the most straightforward possible terms, at the global scale. That is, we should ask “what would the correct policy be if we had no reliable models?”

We have only one planet. This fact radically constrains the kinds of risks that are appropriate to take at a large scale. Even a risk with a very low probability becomes unacceptable when it affects all of us – there is no reversing mistakes of that magnitude.

Without any precise models, we can still reason that polluting or altering our environment significantly could put us in uncharted territory, with no statistical track record and potentially large consequences. It is at the core of both scientific decision making and ancestral wisdom to take seriously the absence of evidence when the consequences of an action can be large. And it is standard textbook decision theory that a policy should depend at least as much on uncertainty concerning the adverse consequences as it does on the known effects.

Further, it has been shown that in any system fraught with opacity, harm is in the dose rather than in the nature of the offending substance: it increases nonlinearly with the quantities at stake. Everything fragile has such a property. While some amount of pollution is inevitable, high quantities of any pollutant put us at a rapidly increasing risk of destabilizing the climate, a system that is integral to the biosphere. Ergo, we should build down CO2 emissions, even regardless of what climate models tell us.

This leads to the following asymmetry in climate policy. The scale of the effect must be demonstrated to be large enough to have impact. Once this is shown, and it has been, the burden of proof of absence of harm is on those who would deny it.

It is the degree of opacity and uncertainty in a system, as well as asymmetry in effect, rather than specific model predictions, that should drive the precautionary measures. Push a complex system too far and it will not come back. The popular belief that uncertainty undermines the case for taking seriously the ‘climate crisis’ that scientists tell us we face is the opposite of the truth. Properly understood, as driving the case for precaution, uncertainty radically underscores that case, and may even constitute it.
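The dose nonlinearity argued above can be illustrated with a toy convex harm function; the quadratic form and the numbers are purely assumptions for demonstration, not a claim about actual climate dynamics:

```python
def harm(dose, k=2.0):
    # Illustrative convex dose-response: harm grows like dose**k with k > 1.
    # The exponent k is an assumed toy parameter, not an empirical fit.
    return dose ** k

# Ten doses of 1 unit versus one dose of 10 units:
spread = 10 * harm(1.0)   # 10.0
lumped = harm(10.0)       # 100.0
assert lumped > spread    # the concentrated dose does disproportionate harm
```

Under any convex harm function, splitting a quantity into small doses does less damage than taking it all at once, which is why high quantities of a pollutant are qualitatively riskier than the same total spread thin.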

Student’s T Random Walks

A few 2D random walks with step magnitudes drawn from the Student's t distribution; the distributions become progressively more fat-tailed further down the page. The graphs can zoom and pan, and zooming is instructive with respect to the fat-tailed dynamics: much of the micro detail is lost at the scale of the largest jumps, and zooming in reveals just how large the rare jumps are relative to the 'typical' ones.

Antifragile Random Walks

A million timesteps of a random walk with increments drawn from a Pareto distribution with $latex \alpha =1 $ and mode shifted down to $latex -11 $ from $latex 1 $. Notice that on most time steps the walk moves downward; the rarer upticks, however, are large, orders of magnitude larger than the downward movements.

In [1]:
import numpy as np

T = 1000000
X = np.zeros(T)

# np.random.pareto(1) draws from a Lomax distribution (mode 0); adding 1
# gives a classical Pareto with mode 1, and subtracting 12 shifts it to -11.
for t in range(T-1):
    X[t+1] = X[t] + 1 + np.random.pareto(1) - 12


Cauchy Random Walks, 2D and 3D

One million steps, with step size drawn from a Cauchy distribution and angle(s) from a flat (uniform) distribution.

In [1]:
import numpy as np

T = 1000000
X = np.zeros((T,2))

# Each step: a Cauchy magnitude in a uniformly random direction.
for t in range(T-1):
    stepSize = np.random.standard_cauchy()
    direction = np.random.rand()*2*np.pi
    xStep, yStep = np.cos(direction)*stepSize, np.sin(direction)*stepSize
    X[t+1,0] = X[t,0] + xStep
    X[t+1,1] = X[t,1] + yStep
In [2]:
X = np.zeros((T,3))

# Same idea in 3D: both angles drawn flat on [0, 2*pi).
for t in range(T-1):
    stepSize = np.random.standard_cauchy()
    direction1 = np.random.rand()*2*np.pi
    direction2 = np.random.rand()*2*np.pi

    xStep = np.cos(direction2)*np.cos(direction1)*stepSize
    yStep = np.sin(direction1)*stepSize
    zStep = np.sin(direction2)*np.cos(direction1)*stepSize
    X[t+1,0] = X[t,0] + xStep
    X[t+1,1] = X[t,1] + yStep
    X[t+1,2] = X[t,2] + zStep
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D

fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.plot(X[:,0], X[:,1], X[:,2], linewidth=0.3)
plt.show()