Increasing Complexity, Big History, and the Anthropic Principle
April 13, 2011
Looking around, we can see incredible levels of complexity in our world. Fascinatingly, things seem to have been getting more complex in our immediate surroundings. Starting with the Big Bang, David Christian gives a delightful TED Talk on how complexity has increased. His “Big History” approach focuses on the times that conditions were just right to pass ‘thresholds’, allowing new forms of complexity – more complex particles, life, and information.
I loved the way he wove physics, chemistry, and biology into a continuous story. Slight variations at the beginning of the universe led to stars, which led to diverse elements and the formation of planets, which (at least once, here) had the right conditions for life. More importantly, the conditions had to drift/develop into a narrow window for the next level of complexity to develop. Dense but not too dense, hot but not too hot, stable but not too stable.
That was a point that stood out to me – the conditions didn’t START perfectly for complexity right off the bat. After all, it’s been over 13 billion years. They changed slowly, with billions of slightly varying instances, and when a threshold was passed it triggered a cascade.
I got the sense that some of the transitions involved hand-waving, but I grant it’s not exactly easy to compress the history of the universe into 17 minutes.
His next step caught me a bit off guard though (around the 15:30 mark):
This is a powerful story. And it’s a story in which humans play an astonishing and creative role. But it also contains warnings. Collective learning is a very, very powerful force, and it’s not clear that we humans are in charge of it.
He goes on to raise the point that our collective learning is so powerful that we have the power to ruin the “Goldilocks Conditions” for increased complexity. Our discovery of nuclear weapons and overuse of fossil fuels could lead to dramatic shifts in our capacity for advancement.
I’m not sure I like the word choice, which implies that some other entity is ‘in charge of’ the story. Perhaps he meant that humans might not be in full *control* of our power. Looking at the irrationalities of human thought and decision-making, it’s certainly possible that we’ll misuse that power with catastrophic results.
Nothing about the story implies that complexity is guaranteed to increase or that it’s necessarily a good thing (topics for another time). But it’s interesting to look at the various ways that conditions became *just* right to allow us to reach this stage, and what might end it.
If it weren’t for the anthropic principle you might start to get ideas…
(h/t Hemant at the Friendly Atheist)