“Explaining Multiple Forms of Conscious Awareness, and More,” Dr. Frank Heile, Dec. 6th, 4:30-6pm, Alway M114.

November 26th, 2018

“Explaining Multiple Forms of Conscious Awareness, and More,” Frank Heile, Ph.D. (Physics, Stanford)

If you would like to attend dinner with the speaker after the seminar, please contact StanfordComplexity (at) gmail.com !!!

This talk presents a materialist model of consciousness that can explain the distinctions between several different forms of conscious awareness:

  • Non-human animal consciousness
  • Modern human consciousness
  • Flow state consciousness
  • “Enlightened” states of consciousness (also known as “nonduality”)
  • Consciousness experienced in Auto-Activation Deficit syndrome
  • The phenomenal consciousness vs. access-consciousness distinction proposed by NYU philosopher Ned Block
  • How spirituality developed and how its invention changed consciousness

A three-agent model of the brain, combined with the Attention Schema Theory model of awareness proposed by Princeton neuroscientist Michael Graziano, provides a framework to explain these distinctions.

Finally, this model can offer answers to the following philosophical questions:

  • Why does conscious awareness seem to be fundamentally non-physical and to not have a location in space?
  • What conditions cause conscious awareness to arise?
  • Can conscious awareness exist without a “self?”
  • Do humans have free will?

There is bound to be disagreement with the proposed answers to these questions, but these concrete answers provide a starting point for thought-provoking discussions! For more details about the proposed model, please reference an additional description here: http://complexity.stanford.edu/blog/seminar-by-dr-frank-heile-on-dec-6th-430-6pm-in-alway-m114.

“Translingual Dynamics”

November 14th, 2018

Thursday, November 29, 6:30pm-9pm.

Room: Z301 (Stanford Business School)

Link to gCal invite

Seminar by Dr. Frank Heile on Dec. 6th, 4:30-6pm in Alway M114.

October 30th, 2018

Stanford Complexity Group will host a talk by Dr. Frank Heile on Dec. 6th, from 4:30-6pm in Alway M114 (Stanford Medical School). Details of the talk follow:

“Explaining Multiple Forms of Conscious Awareness, and More,” Frank Heile, Ph.D. (Physics, Stanford)

Abstract: An agent, such as a human being, is an entity that can sense the world and can act on the world, often in the pursuit of goals. Decomposing a complex agent into multiple sub-agents is one strategy for gaining insight into underlying mechanisms. The high-level functional model proposed here decomposes the brain into three interconnected sub-agents: the Thinker, the Doer, and the Experiencer. The Thinker and Doer are justified by their consistency with well-established, experimentally derived theories of cognition in both psychology (Dual Process Theory [1]) and neuroscience (the Action-Outcome/Stimulus-Response model [2]). A theorem in control theory [3] proposes that an effective agent should contain a model of the world in which it operates. This theorem suggests the existence of the third agent, the Experiencer; this agent would construct the model of the world that is shared by the Thinker and Doer.

Attention Schema Theory [4] proposes a model of awareness composed of three objects: the agent’s self-model, the agent’s attention schema (which is a model of the neurological attention mechanism), and the representation of the attended object. This awareness model is applied to each of the three proposed sub-agents to describe the agent’s forms of awareness about external objects, and the types of self-awareness each agent would experience. The result is three different forms of conscious awareness. One of these forms would be the consciousness that non-human animals (and ancient humans) would experience. The second is the default awareness of modern humans. The final form of awareness corresponds to some of the experiences that occur in transient “flow states” or the more persistent “enlightened consciousness states.” In addition, this three-agent model clarifies the distinction, proposed by philosopher Ned Block, between phenomenal consciousness and access consciousness [5].

This three-agent proposal can also explain the rare neurological syndrome of Auto-Activation Deficit [6]—a specific type of apathy where patients can sit for hours, not moving or talking. Surprisingly, this inertia is immediately reversed when the patient is asked to perform some activity or to answer a question. Additionally, many of these patients have blunted affect and report that they do not experience thoughts. This three-agent model interprets all of these symptoms as evidence that these patients are suffering from a disabled Thinker.

Finally, this model explains the reasons for the development of both theistic and non-theistic spiritual traditions, and the efficacy of spiritual practices. Attend this session to explore a novel approach to assessing scientific or philosophical theories of consciousness, and the questions of agency and free will.


[1] Kahneman, D., 2011. Thinking Fast and Slow. New York: Farrar, Straus and Giroux.

Evans, J. S. B. T. & Frankish, K., 2009. In Two Minds, Dual Processes and Beyond. Oxford, UK, Oxford University Press.

[2] Yin, H. H. & Knowlton, B. J., 2006. The role of the basal ganglia in habit formation. Nature Reviews Neuroscience, Volume 7, pp. 464-476.

[3] Conant, R. C. & Ashby, W. R., 1970. Every Good Regulator of a System Must Be a Model of That System. Int. J. Systems Sci., 1(2), pp. 89-97.

[4] Graziano, M. S. A. & Webb, T. W., 2015. The attention schema theory: a mechanistic account of subjective awareness. Front. Psych., 6(500).

[5] Block, N., 1996. How can we find the neural correlate of consciousness? Trends in Neurosciences, 19(11), pp. 456-459.

[6] Habib, M., 2004. Athymhormia and Disorders of Motivation in Basal Ganglia Disease. The Journal of Neuropsychiatry and Clinical Neurosciences, 16(4), pp. 509-524.

Laplane, D. & Dubois, B., 2001. Auto-Activation Deficit: A Basal Ganglia Related Syndrome. Movement Disorders, 16(5), pp. 810-814.

“DAO Democracy” Seminar, Ralph C. Merkle – Oct. 18, 2018.

October 7th, 2018

“DAO Democracy”, Ralph C. Merkle
October 18th @ 4-5:30pm.
Venue: LKSC 120 (Medical School)

Event Video:


Dr. Ralph Merkle is a Senior Research Fellow at the Institute for Molecular Manufacturing, and a true pioneer in various areas of engineering and computer science. He will be giving a talk about the potential uses of new decentralized digital technologies to improve our democratic systems.

Read Dr. Merkle’s recent publication on DAO Democracy here (page 28).

All are welcome to attend this exciting lecture; it will be held in a large lecture hall.

Abstract: In a democracy, ordinary citizens decide complex, fateful issues by voting. Recent history suggests this process is less than optimal. Analysis of voting usually concludes it provides negligible economic value to the voter. Voter turnout is therefore highly dependent on emotional factors (“rallies”, “peer pressure” and the like). Voters are often ignorant of basic facts, and are subjected to sophisticated misinformation campaigns. Half of voters are below average. Elected officials have been known to ignore their promises once in office, and the mechanisms of government are not always transparent in their operation. This combination of weaknesses makes current democracies grossly inefficient at best, and prone to catastrophic failures at worst. Ideas from prediction markets, Decentralized Autonomous Organizations (DAOs), the wisdom of crowds, and futarchy can be combined into what might be called a DAO Democracy, a form of government that appears to solve most of these ills.

SCG Seminar on 10/1/2018 – Dr. Toby Lowe – Complexity & Public Management

August 27th, 2018

SCG presents a special seminar by Dr. Toby Lowe (Newcastle University Business School, Open Lab).

*** Watch the video of this talk at: https://www.youtube.com/watch?v=ZS97cAizcYk

Title: “Why trust helps us to manage better in complex environments: exploring a complexity-informed Public Management paradigm”

Date, Time, Venue: 10/1/2018 @ 4pm-5:30pm in Alway M106 (Medical School). 


Currently, public management (sometimes called public administration) is dominated by a paradigm based on linear notions of change. This paradigm, called “New Public Management”, rests on the idea that work towards the achievement of social goals is best undertaken by setting targets for desired outcomes (like “higher rates of employment amongst a target group” or “fewer recorded crimes in a neighbourhood”), and then managing the performance of those who are tasked with delivering those goals by measuring the amount of progress which has been made, using agreed proxy measures. It seeks to hold people/organisations accountable for achieving desired outcomes.

This paradigm is failing. Rather than create real improvements on the ground, evidence suggests that instead it promotes “gaming” amongst those who are managed in this way – it turns social action into a game which is ‘won’ by producing data which makes it look like your programme is succeeding. In other words, this paradigm turns everyone’s job into the production of good-looking data, rather than addressing complex, real-world social problems.

A new complexity-informed paradigm

Complexity explains why this paradigm fails. The social outcomes we seek (like higher employment rates, or less crime) are not delivered by organisations. They are emergent properties of complex systems. Complexity explains why proxy measures are not effective substitutes as performance feedback mechanisms. And it explains why it is folly to hold people/organisations accountable for producing desired social outcomes, when those outcomes are beyond the control of particular actors in those systems.

Consequently, a new complexity-informed public management paradigm is emerging (https://collaboratecic.com/a-whole-new-world-funding-and-commissioning-in-complexity-12b6bdc2abd8). This new paradigm asks: how do we shape and influence the behaviour of complex systems in order to produce desired results?

This new paradigm places trust at the heart of the distribution and performance management of resources for social action. It is based on three key shifts:

  • Recognition of the intrinsic motivation of people who undertake social interventions
  • Using learning (rather than vertical accountability) as the driver of performance improvement
  • Funders/Public bodies taking responsibility for the health of the eco-systems from which positive outcomes emerge

If you would like to join in with others who are exploring this new paradigm beforehand, you can become part of the conversation at: https://khub.net/group/complexity-friendly-system-oriented-commissioning-pilot-project

Cultural fit and Complexity

March 29th, 2018

Interview with Professor Amir Goldberg on cultural fit

What it means to “fit in” and why it matters

Almost everyone has had the experience of trying to fit in to a new social environment. Companies consider the cultural fit of applicants when making hiring decisions. Cultural fit can even be used as an argument in court: in 2012, Ellen Pao lost her discrimination case against the venture capital firm Kleiner Perkins based in part on the argument that she was denied a promotion not because of her gender but because of poor cultural fit.

Amir Goldberg, Associate Professor in the Stanford Graduate School of Business, argues that cultural fit matters both for the social group and for the individual. Professor Goldberg spoke at the Stanford Complexity Symposium on November 14, 2017, describing his work on measuring cultural fit. Drawing on previous research on cultural fit from managerial science as well as psychology and sociology, Professor Goldberg outlines the difference between cognitive cultural fit (how one’s private self adheres to the surrounding culture) and behavioral cultural fit (how one’s behaviors adhere to the surrounding culture).

His 2017 paper with Sameer B. Srivastava, V. Govind Manian, and Christopher Potts, on which his symposium talk focused, introduces an innovative method for measuring cultural fit, in which “culture” is considered to emerge from individual interactions. The method compares the language use of incoming and outgoing messages between employees at the same company, measuring the similarity of different “lexicographic units”, i.e. the usage of particular words, punctuation, and so on.
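As a toy illustration of the idea (and emphatically not the authors’ actual estimator, which is considerably more refined), one could score an employee’s linguistic fit by comparing the word-usage profile of their outgoing messages against the profile of the messages they receive:

```python
from collections import Counter
import math

def linguistic_fit(sent_msgs, received_msgs):
    """Cosine similarity between the word-frequency profiles of the
    messages a person sends and the messages they receive: 1.0 means
    identical usage patterns, 0.0 means no overlap at all."""
    sent = Counter(w for m in sent_msgs for w in m.lower().split())
    recv = Counter(w for m in received_msgs for w in m.lower().split())
    dot = sum(n * recv[w] for w, n in sent.items())
    norm = (math.sqrt(sum(n * n for n in sent.values())) *
            math.sqrt(sum(n * n for n in recv.values())))
    return dot / norm if norm else 0.0
```

Tracking a score like this month by month would yield the kind of fit trajectory the study analyzes over an employee’s tenure.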

To illustrate how different cultural norms are reflected in lexicographic unit usage differences, consider these starkly different example emails from executives at Sony and Enron.

Figure from Sameer B. Srivastava, Amir Goldberg, V. Govind Manian, Christopher Potts (2017) Enculturation Trajectories: Language, Cultural Adaptation, and Individual Outcomes in Organizations. Management Science. For details on the standardization of the timeline, see their paper below

Using this method, their work demonstrates how, in a corporate setting, an individual’s cultural fit can change over time. They found that a person’s cultural fit when they joined an organization didn’t matter as much for their success at the company as their cultural fit trajectory over time. Those who were let go tended to decrease their cultural fit over time. Those who stayed tended to increase their cultural fit. A third group, who quit, increased then decreased their cultural fit.

Figure from Sameer B. Srivastava, Amir Goldberg, V. Govind Manian, Christopher Potts (2017) Enculturation Trajectories: Language, Cultural Adaptation, and Individual Outcomes in Organizations. Management Science. Sony and Enron emails are referenced from publicly available archives.

Professor Goldberg’s talk sparked an interesting discussion at our symposium about how culture is a complex system, how this insight helps us to better understand culture, and what the implications of this work could be. Stanford Complexity Group (SCG) sat down with Professor Goldberg to learn more.


[Interview has been edited for clarity and length]

SCG: You talk about the cultural norms as an emergent process from these individual interactions. Could you speak on how that might be different in terms of top-down cultural norms? For example, people say that at Amazon they believe in frugality as an important value, which could be like a cultural norm.

Prof. Goldberg: First, you want to ask yourself: where does culture exist? Who defines culture: is it CEOs? Priests? Soviet ideologues? Or is culture actually what happens on the ground? The way that culture matters is how the translation from beliefs to behaviors is distributed across a population.


Where does culture exist?


Even if the corporate leader says that the ethos is “frugality,” it is only the ethos insomuch as it’s what people believe is a desirable behavior and that’s what they pursue. It’s an empirical question. The fact that Amazon believes in frugality, and that Jeff Bezos tells us that’s what they believe, tells us nothing about how people behave unsupervised. What really matters is the extent to which there’s buy-in and the extent to which this buy-in is held together in equilibrium with the normative behaviors of others.

No one at Enron said the ethos was to cheat and to steal money. They had very beautiful “serve the customer” statements, or whatever, that they put on their walls. But there’s a big difference between saying it and creating the normative environment that actually rewards those behaviors.

I think every culture is a complex system and a complex equilibrium of norms, where the norms are basically the enacted behaviors, and of beliefs, which are the privately held perceptions that lead to them.

SCG: You mention here the difference and relationship between cognitive and behavioral cultural fit. When you’re measuring people’s emails in this research project, that’s their behavioral cultural fit, right? Could you please explain how cognitive and behavioral cultural fit are related and how the differences between them may end up mattering, for a company or for an individual’s success?

Prof. Goldberg: The prior we came into the research with is that they’re correlated, but the more we scrape beneath the surface, the more we realize that assumption may be incorrect. And this harkens back to some of the most fundamental work in sociology, by the ethnographer Erving Goffman: The Presentation of Self in Everyday Life. He has this metaphor of the backstage and the front stage. It’s known as the “dramaturgical model” of social interaction: basically, every interaction is a form of performance in which people present themselves. When they interact, there are lots of thoughts going on in their heads that they are not communicating, and their “performance” in the interaction needs to adhere to a certain code about what’s appropriate and what’s not.

One thing that came out of that research was the “breaching experiments,” in which people behave in really weird ways. There are very clear expectations about what is appropriate behavior, and about the signals one gives regarding how one expects one’s interlocutor to behave, even in a transient interaction. I think it’s those moments when these codes are breached that make salient to us how much of this behavior is codified.

One of the realizations we’ve come to in the work we’ve done is that an important dimension is a person’s capacity to read the code: to understand what is appropriate and normative and to have a mental model of what the other expects.

SCG: In an organization, would you say that only the behavioral cultural fit matters?

Prof. Goldberg: It’s not the only thing that matters. It matters for the individual because that is what others see. That’s what’s communicated about the implied beliefs of the individuals. We know there are some people who are chameleons, they’re very good at adapting. They think to themselves, “I don’t buy into this place. I hate this place. But I’m going to behave as if I do.”

This is often referred to in psychology as “self-monitoring”. It’s one’s capacity to monitor their authentic self and to present themselves in a way which is congruent with what their interlocutor’s expectations are. And how do you infer what their interlocutor’s expectations are? As a function of their interlocutor’s behavior as well. So there’s a delicate equilibrium, and as long as the expectation is held, those who are capable of strategic action, those who are good code readers, will behave in a way that is congruent with the code, irrespective of their private beliefs.


Culture can sustain a handful of people who are fakers, but overall it is difficult.


On average you would expect if there is a significant incongruence between beliefs and behaviors across the population then that culture is not sustainable. Culture can sustain a handful of people who are fakers, but overall it is difficult.

SCG: Is this something that you’re working on?

Prof. Goldberg: The science is relatively early in understanding the conditions under which there is or is not congruence between the private and the public self, and what the implications are. We have a new paper that is precisely about that. We call it “lifting the curtain.” It’s about this metaphor.

Thinking about complexity, in the aggregate these are nonlinear relationships. Imagine you can measure the difference between the behaviors and the private beliefs of an individual and average them across the organization. I imagine there would not be a linear relationship between that and the strength of the culture, or between that and the likelihood of that culture collapsing. We know with complex systems, these complex interactions can lead to phase transitions.

SCG: Can you talk about how your research might be applied in a corporate setting? When we’re talking about cultural fit there are also other demographic factors that affect whether or not someone fits in. If someone is 65 and they work at Snapchat, they may have difficulty fitting in because they’re misreading cues, but they also might not be fitting in because they’re 65 and the average age is something like 22. You also mentioned in your talk that women tend to match better but tend to be rewarded for it less.

Prof. Goldberg: There’s a relationship between socio-demographics and cultural preferences. Sometimes, socio-demographics are good proxies for cultural preferences, but sometimes they’re not. This is different from saying that people interact with people of different social categories in discriminating or compensating ways. So some people might be gender discriminatory in certain ways, and that’s going to be irrespective of that woman’s behaviors towards them. But when we think more broadly about the implications, we have a more refined tool for looking at cultural fit above and beyond crude socio-demographic categories. There’s a way, first of all, to test which social categories are or are not homogeneous in their capability to fit, or things of that sort.

I think the example that I gave about gender is not so much related to women’s ability to read the code; it’s about gendered biases in how the behaviors of women are interpreted. Those biases are not specific to this firm but are specific to American culture, which is one of the most gender-equitable cultures but is still a misogynistic one. There’s probably a lot of variance in misogyny, or whatever you want to call it, across organizations.


I don’t want someone to create a Minority Report kind of world.


But I’m a little wary about implications. I don’t want someone to create a Minority Report kind of world where they come into an organization and fire people because our algorithm suggests that they’re going to get fired later.

In terms of the implications, what is my responsibility here? I think scientists need to think about the implications of what they do. So, I’m not going to give myself a free pass. That’s why it’s important whenever I talk in public to say I would be averse to having this technology being used in any way, shape, or form to affect the lives of individuals. I think it would be a diagnostic tool for understanding organizations or maybe measuring the cultural health of an organization but I would be strongly opposed to it being used to determine the fates of individuals.

I think one message that comes out of our study is that the vast majority of the way corporate leaders think about culture is in terms of cultural fit. For example, they hire for cultural fit. I think that’s a part of it, yes, but another really important part is that there are people who are capable of adapting. We might call it faking it, which might seem like a bad thing, but if we accept that all interactions are a performance, we’re all faking it all the time. So, maybe it’s important to have people who are capable of faking.

I think it’s easier to measure cultural fit at entry; it’s harder to measure people and to think systematically about how they’re fitting in over time. So one implication is that we need to think about how to manage organizations in a way that’s attentive to post-hire cultural change. I think it is my responsibility, and it’s important to consider implications, because the findings are sexy and the curves are beautiful. But a 95% confidence interval is a 95% confidence interval: there are a lot of people who fall outside of it, and we need to remember that. Small changes might have phase-transition effects, and I don’t want anyone to get fired for being slightly outside the confidence interval.


Data is not a panacea.


Data is not a panacea. The thought that we could just come in with algorithms and solve all our problems is a frightening and misconceived thought. First of all, data is only as good as the quality of the real phenomena it represents. But second, it’s only as good as the analysis it’s applied to. Every analysis, through modelling decisions, makes assumptions. If we then take the results of these models as objective truths about the world, without recognizing that our assumptions are built into them, and we affect people’s lives, then we create a terrible world. So, I’m all in favor of people analytics, but only insofar as it’s deployed responsibly and there’s a human decision involved in the process. I do not want to concede authority to bots and algorithms, including the ones that I’ve produced.


To learn more about Professor Goldberg’s work you can check out his talk from the Stanford Complexity Symposium on November 14, 2017 (https://www.youtube.com/watch?v=BvdAjwDjeJo) or see his 2017 paper, “Enculturation Trajectories: Language, Cultural Adaptation, and Individual Outcomes in Organizations” (https://pubsonline.informs.org/doi/pdf/10.1287/mnsc.2016.2671).


“It’s Complicated…” — guest blog post by Meredith Tromble

January 24th, 2018

This is a guest post about our recent Complexity Symposium written by Meredith Tromble, an Associate Professor of Interdisciplinary Studies at San Francisco Art Institute. We were also lucky to have Prof. Tromble display some of her art at our Symposium! More on Professor Tromble can be found here: http://meredithtromble.net/


“It’s Complicated…” by Meredith Tromble

When animal behaviorist Kelly Finn forwarded the announcement for “It’s Complicated,” I just signed up. Kelly and I know each other through University of California, Davis, where I am artist-in-residence at the Complexity Sciences Center; I was curious what a new-to-me group of researchers would have to say about “The Relationship of Complexity Theory to Normative Discourse in Science, Society, and Beyond.” Of the three big questions that the organizers used to focus the day (“What is ‘Complexity Science’,” “How is Complexity Science integrated into various disciplines,” and “How does Complexity Science affect how we solve scientific, social, or philosophical problems?”) the question that sold me on coming was the third one. The symposium delivered a cornucopia of provisional answers and, even more fruitfully, an abundance of questions that could lead to even more refined and successful solutions.

The talks were generally excellent. In the three that were most immediately useful to me as an artist explorer of science, self-organization researcher Carlos Gershenson opened with a lucid introduction to complexity as a philosophical concern, biologist-philosopher Rasmus Grønfelt Winther compressed the soul of a very substantial forthcoming book into a mash-up of cartography, feminist science studies, and mapping genetics, and historian Jessica Riskin recounted the banishment of historical explanation from natural science in an illuminating talk that should soon become a substantial book, if it is not one already. I was familiar with the food web work of biologist and network scientist Neo Martinez, but hearing about it in close proximity to biologist Deborah Gordon’s talk on algorithms that determine collective behavior hinted at questions about biology and scale worth pursuing further. If John Harte’s discussion of maximum entropy quickly outstripped my ability to follow in detail, I enjoyed the challenge, and other presentations, such as the trio of talks on earthquake prediction, were accessible for the mathematically less-trained.

Reflecting on the day, I imagined our knowledge world — all the huge universe of things that artists, scholars, and scientists are making, writing, and researching — as a network, one where each discipline is a lively hub, getting richer and deeper by the day into its subject, be it sculpture, biology or history. But the network also has links — moments such as the symposium where people who are expert in one field allow themselves to be instructed by other fields. Energizing these links has a great deal to do with the resiliency of the network as a whole. One of the books that continues to inspire me is an account of just such a gathering in 1968, Mary Catherine Bateson’s Our Own Metaphor: A Personal Account of a Conference on the Effects of Conscious Purpose on Human Adaptation (1972). “It’s Complicated…” continued that tradition.

Gossip: Identifying Central Individuals in a Social Network

April 26th, 2017

Because the dynamics of networks depend on small interactions between many components, networks are an important topic in complexity theory. Social networks describe individuals and their connections to one another. Matthew O. Jackson, Professor of Economics at Stanford, is interested in how the shape of a social network affects the flow of information across it.

Stanford Complexity Group hosted a seminar entitled “Gossip: Identifying Central Individuals in a Social Network” by Matt Jackson on 3/13/2017. His talk focused on his research on microfinance efforts in a collection of rural South Indian villages. Advertising via means that might be more obvious in a Western context (for example, via the internet or printed posters) is less effective in these villages, in part due to low literacy rates. Instead, more effective efforts to advertise the availability of microfinance loans rely on person-to-person communication. Spreading information person to person over a social network relies on identifying the “most central” individuals in that network. But what does it mean for an individual to be “central”? And how do different measures of centrality lead to differences in how information spreads over the network?

Professor Jackson and his co-authors simulated the flow of information over the social network structure of actual villages, with information spreading from the “most central” individuals and centrality measured in different ways. “Degree centrality,” the number of edges a node has, was not particularly effective at identifying individuals who could spread information over the network. Eigenvector centrality was a more effective measure for identifying good candidates. Like degree centrality, eigenvector centrality takes into account an individual’s first-degree connections, but it also considers how well connected the individual’s neighbors are.
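The distinction can be sketched in a few lines of numpy (an illustrative toy, not the study’s code): degree centrality is just a node’s row sum in the adjacency matrix, while eigenvector centrality is the matrix’s leading eigenvector, approximated here by power iteration.

```python
import numpy as np

def eigenvector_centrality(adj, iters=200):
    """Power iteration: repeatedly let each node's score become the sum
    of its neighbors' scores, then renormalize. For a connected,
    non-bipartite graph this converges to the leading eigenvector."""
    v = np.ones(adj.shape[0])
    for _ in range(iters):
        v = adj @ v
        v /= np.linalg.norm(v)
    return v

# Toy graph: node 0 is a hub tied to everyone; nodes 3 and 4 are also
# tied to each other, while nodes 1 and 2 hang off the hub alone.
A = np.array([[0, 1, 1, 1, 1],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)

degree = A.sum(axis=1)             # degree centrality
eigen = eigenvector_centrality(A)  # eigenvector centrality
```

On this toy graph both measures rank the hub first, but eigenvector centrality credits nodes 3 and 4 not only for their extra edge: it rewards them for being connected to each other and to the well-connected hub.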

Eigenvector centrality, however, assumes that the probability of someone sharing information doesn’t decay with time. The authors therefore developed a new measure of centrality, diffusion centrality, which is parameterized by time.
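Diffusion centrality, as defined by Banerjee, Chandrasekhar, Duflo, and Jackson, counts the expected number of times news seeded at a node is heard by others when each link transmits with probability q in each of T periods. A minimal numpy sketch of that definition (illustrative only, not the authors’ code):

```python
import numpy as np

def diffusion_centrality(adj, q, T):
    """Row sums of sum_{t=1..T} (q * adj)^t: the expected number of
    times information starting at each node reaches others within T
    periods, if every link passes it along with probability q."""
    n = adj.shape[0]
    total = np.zeros((n, n))
    walk = np.eye(n)
    for _ in range(T):
        walk = walk @ (q * adj)
        total += walk
    return total.sum(axis=1)
```

With T = 1 this reduces to q times degree centrality, and as T grows large (with q high enough) the ranking approaches that of eigenvector centrality, so the time parameter T interpolates between the two measures.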

Ultimately, these measures of centrality rely on mapping out the entire social network. In reality, this would not be an effective strategy for a company trying to figure out how best to advertise its service (consider: a representative going door to door to survey each family’s connections might as well advertise the service to each family directly). This quandary motivated another tool for measuring centrality: gossip. By asking a few members of the community to identify the people they had heard news about recently, the researchers derived a measure they termed “communication centrality.” Individuals with high communication centrality were often also those who were most central when centrality was defined by diffusion centrality.

Testing these theoretical results in a few dozen South Indian villages, Matt Jackson and collaborators considered the participation in microfinance when the loans were advertised to a few “central” individuals, using the different measures of centrality. While degree centrality did not have a high correlation with the percent of individuals participating in microfinance from a village, both “diffusion centrality” and “communication centrality” did.

While the results of these projects may be specific to microfinance in small villages, the question of how to spread helpful, important information over a network extends to other types of social networks as well. Matt Jackson is also interested in studying the flow of information over larger web-based social networks, for example, how simple parenting tips, like talking to babies from an early age, could effectively spread through social networks.

Matt Jackson’s work is a fascinating example of applying complexity theory to real world problems and we thank him for a wonderful seminar.

Discussion of “Racism 4.0, Civity, and Re-Constitution” by Palma Strand

October 14th, 2016

Next week, we will be hosting a visit by Palma Strand. In preparation for her seminar, we discussed Strand’s 2015 paper “Racism 4.0, Civity, and Re-Constitution”. The paper can be found here. This blog post is a summary of what we talked about. If you are curious about complexity and law, or perhaps “Applied Chaos Theory”, then read on…

First, we talked about the content of the paper and sketched out Strand’s framework. The overarching model in this paper is that we can think about the functioning of a nation in two parts: a “software” component (individual behavioral biases), which runs on a “hardware” component (structural & legal inequalities). Focusing on Blackness and Whiteness in the USA, Strand describes 4 iterations of the “operating system” of inequality. Racism 1.0 was the period of slavery, 2.0 was the post-slavery period of domestic terrorist acts such as lynching, 3.0 was typified by Jim Crow laws, and today we find ourselves in Racism 4.0. In today’s Racism 4.0, the overt manifestations of discrimination are nominally illegal (due to Civil Rights legislation), yet we still have enormous social/economic inequality.

The main question is now: how do we proceed towards equality and justice? Strand suggests that the way forward should be guided by a pair of concepts: “Civity” and “Re-constitution”.

“Civity” is a pragmatic sociological philosophy that addresses problems in the “software” branch of racism, for example by forming positive-valence interracial relationships. “Re-constitution” is a morally-imperative legal framework that addresses problems in the “hardware” branch, for example by directing housing organizations to remedy past patterns of housing inequality.

In the following image, the top branch is “software”, representing individual-level behavioral bias. This bias consists of pro-White & anti-Black components, and is addressed by Civity. The bottom branch is “hardware”, representing structural/legal inequality. This inequality consists of pro-White & anti-Black components, and is addressed by Re-constitution.


Civity is based on the creation of cross-cutting relationships. In the left side of the following image, a social network with mainly white nodes becomes tied to a social network with mainly blue nodes, after the creation of a strong link between the hubs. On the right, two smaller networks are not connected to each other, but each network is demographically admixed.


We saw several implicit and explicit parallels between Strand’s perspective on inequality and the Complex Adaptive System (CAS) framework. Points of harmony include:

  1. Non-linear emergent outcomes of a collective system. In society (as in other CAS systems), many distributed agents act on local information, generating system-level outcomes. The system-level outcome may be quite unpredictable from only knowing the agent-level rules, and vice-versa. This can be modeled with agent-based models.
  2. Path dependence. Strand discusses the role of past inequalities in current decision-making, for example in cases of affirmative action. Many CAS systems have a tremendously long memory of the environment. This is sometimes called “path-dependence”. For systems with high path-dependence, the response to a given stimulus is strongly contingent on the past experiences of the system.
  3. Network thinking. One manifestation of Civity is the restructuring of small-world social networks to include positive-valence interracial bridges between hubs. Many CAS systems are represented by networks, and network/graph theory is a long-time playground for CAS theory development.
  4. Anticipation. Many CAS systems do not passively react, they actively pre-act. For example, the pancreas starts to secrete insulin when we see food, not when our blood sugar levels rise. Some CAS systems form models of the world around them, and act based upon predictions about the way the world works (Bayesian brain). On page 784 there is a nice discussion about how housing agencies can become more proactive to nip problems in the bud, instead of simply responding to crisis-level inequality. This cybernetic perspective on law might facilitate more effective treatments of social problems.
  5. Multiple spatial-temporal scales with feedback loops. Many CAS systems are integrated in behavior across multiple spatial scales. For example cells cooperate to generate tissue function, and tissues cooperate to generate organismal behavior. Other examples might include the relationships between local, state, and national government. Developmental complex systems are like embryonic Russian matryoshka dolls — hierarchically-structured dynamic matter, resplendent with feedback loops between scales, effortlessly giving rise to self-organized beauty.
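Point 1 above, the unpredictability of system-level outcomes from agent-level rules, can be illustrated with a minimal agent-based sketch. The model below is a generic threshold-adoption toy on a ring network, purely for illustration and not drawn from Strand's paper: each agent adopts a behavior once the fraction of adopting neighbors meets a threshold, and a small change in that threshold flips the collective outcome from a full cascade to near-total containment.

```python
def threshold_cascade(n=30, threshold=0.5, rounds=20):
    """Minimal agent-based sketch: agents on a ring adopt a behavior once
    the fraction of their two neighbors who have adopted meets a
    threshold. The system-level outcome (full cascade vs. containment)
    is hard to read off the agent-level rule alone."""
    state = [False] * n
    state[0] = state[n // 2] = True  # two initial adopters
    for _ in range(rounds):
        new = state[:]
        for i in range(n):
            nbrs = [state[(i - 1) % n], state[(i + 1) % n]]
            if sum(nbrs) / len(nbrs) >= threshold:
                new[i] = True
        state = new
    return sum(state) / n  # final fraction of adopters
```

At threshold 0.5 one adopting neighbor suffices and the behavior sweeps the whole ring; at 0.6 both neighbors are required and the two seeds never spread, a small rule change with a qualitatively different emergent outcome.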

After summarizing the main points of the paper and brainstorming points of harmony with CAS, we had a few open questions. To give two examples:

First, we wanted to learn more about how the many governmental scales of action can be synchronized for the good of the group. For example which equitable actions should be performed within Palo Alto, and which between East Palo Alto and Palo Alto? Which actions should include all of the Bay Area, California, or the country?

Second, we were inspired by the citations to a literature corpus that we were less-than-familiar with: non-quantitative perspectives on complexity. For example, there were citations about leadership complexity, organizational structure, and complexity in legislatures. We wanted to know about the historical and current patterns of citation/idea-exchange between the quantitative and non-quantitative complexity literature.

All in all, it was a great paper and interesting discussion. Strand’s model is both pragmatic and normative about social justice. Without normativity, a social model is irrelevant. But without pragmatism, a social model is impractical. Thus Strand’s framework can serve the academy but, more importantly, aims to serve the population.


Dr. W. Brian Arthur – Complexity and the Economy

March 3rd, 2016

Dr. W. Brian Arthur gave the 2015 Fall Complexity Seminar. His talk was titled Complexity and the Economy. Dr. Arthur spoke about the early days of this effort at SFI and explained how this new economics works and why it is needed. Complexity gives a view of the economy not as a perfectly balanced, smoothly functioning machine, but as a system that is organic, evolutionary, ever-changing, and historically-contingent.

Watch the video of his talk here: