Process
Status
- Output: None
- Questions: None
- Claims: None
- Highlights: Done (see section below)
Highlights
id651265429
This game is now known as the prisoner’s dilemma. So let’s play a game. A banker with a chest full of gold coins invites you and another player to play against each other. You each get two choices. You can cooperate or you can defect. If you both cooperate, you each get three coins. If one of you cooperates, but the other defects, then the one who defected gets five coins and the other gets nothing. And if you both defect, then you each get a coin.
✏️ Doesn't it feel inherently skewed that we've decided a defector gets rewarded while the cooperator loses? What if instead the cooperator got the full reward and the defector lost out? What dictated the original distribution of rewards? Was it a skewed perspective, or propaganda, favoring defection over cooperation? Why is being selfish worth more points than helping another? Is that accurate, or merely a pessimistic perspective?
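The payoff scheme from the highlight can be sketched as a simple lookup table (the coin values come straight from the quote above; the `payoff` function name is my own):

```python
# Moves: "C" = cooperate, "D" = defect.
# Values are the coin counts from the highlight above.
PAYOFFS = {
    ("C", "C"): (3, 3),  # both cooperate: 3 coins each
    ("C", "D"): (0, 5),  # I cooperate, you defect: I get nothing
    ("D", "C"): (5, 0),  # I defect, you cooperate: I get 5
    ("D", "D"): (1, 1),  # both defect: 1 coin each
}

def payoff(my_move, their_move):
    """Return (my coins, their coins) for a single round."""
    return PAYOFFS[(my_move, their_move)]
```

Note the asymmetry the annotation complains about: unilateral defection (5) pays more than mutual cooperation (3), which is exactly what makes the dilemma a dilemma.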
id651265613
Axelrod found that all the best performing strategies, including Tit for Tat, shared four qualities.
✏️ Based on an '80s tournament involving multiple competing programs, four traits seemed to make strategies successful:
- Nice: never be the first to defect ("nasty" strategies initiate defection).
- Forgiving: retaliate, but don't hold a grudge and keep defecting. If anything, being extra forgiving seems even better: the winning strategy retaliated after every defection and then moved on, but a strategy that retaliated only after two successive defections would have scored even more points.
- Retaliatory: don't be a pushover. Strike back immediately.
- Clear: if your intentions and actions are unclear, the other player can't find a pattern to trust, so the strategy is less effective. Being clear in your actions and intentions leads to clear responses.
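For reference, a minimal sketch of Tit for Tat and an iterated match, assuming the payoff values from the first highlight (the `play` helper, strategy signatures, and round count are my own):

```python
PAYOFFS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opp_moves):
    """Nice, retaliatory, forgiving, clear: open with cooperation,
    then simply copy the opponent's previous move."""
    return "C" if not opp_moves else opp_moves[-1]

def always_defect(opp_moves):
    """A maximally nasty strategy, for comparison."""
    return "D"

def play(strat_a, strat_b, rounds):
    """Iterated match; each strategy sees the opponent's move history."""
    moves_a, moves_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(moves_b)  # A reacts to B's past moves
        b = strat_b(moves_a)
        pa, pb = PAYOFFS[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b
```

Over 10 rounds, Tit for Tat against itself earns (30, 30), while against Always Defect it loses the match (9 vs. 14) but avoids being exploited more than once, which is why it racks up points across a whole tournament.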
id651266131
Imagine a world that is a really nasty place to live, more or less populated with players that always defect, except there’s a little cluster of tit-for-tat players that live in some kind of nucleus and they get to play with each other a lot because they’re geographically sequestered. They will start building up a lot of points, and also because that translates into offspring, they’ll start to take over the population. So in fact, Axelrod showed that a little island of cooperation can emerge and spread and eventually will take over the world, which is fantastic. How can cooperation emerge in a population of players who are self-interested? Who are not trying to be good because they’re good-hearted. You don’t have to be altruistic. You could be looking out for number one for yourself and your own interests. And yet cooperation can still emerge. - Some argue that this could explain how we went from a world full of completely selfish organisms where every organism only cared about themselves to one where cooperation emerged and flourished. From impalas grooming each other to fish cleaning sharks. Many life forms experience conflicts similar to the prisoner’s dilemma, but because they don’t interact just once, both can be better off by cooperating. And this doesn’t require trust or conscious thought either because the strategy could be encoded in DNA, as long as it performs better than the other strategies, it can take over a population.
✏️ Studies show that a small cluster of tit-for-tat cooperators (in a world of self-interest) can succeed, grow, and take over the whole population. So how can cooperation emerge among purely self-interested players? In the short term, the environment shapes the players; in the long term, the players shape the environment.
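The "island of cooperation" idea can be checked with a toy calculation, assuming 200-round matches, the payoff values from the first highlight, and a made-up fraction `f` of matches that a clustered player spends inside the cluster (all parameter choices are mine, not from the source):

```python
PAYOFFS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opp_moves):
    return "C" if not opp_moves else opp_moves[-1]

def always_defect(opp_moves):
    return "D"

def play(strat_a, strat_b, rounds=200):
    """Total scores for an iterated match."""
    moves_a, moves_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(moves_b), strat_b(moves_a)
        pa, pb = PAYOFFS[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b

def avg_tft_score(f, rounds=200):
    """Average match score of a clustered TFT player that spends a
    fraction `f` of its matches against fellow cluster members and
    the rest against the surrounding always-defectors."""
    tft_vs_tft, _ = play(tit_for_tat, tit_for_tat, rounds)
    tft_vs_alld, _ = play(tit_for_tat, always_defect, rounds)
    return f * tft_vs_tft + (1 - f) * tft_vs_alld

def avg_defector_score(rounds=200):
    """A defector in the surrounding sea mostly meets other defectors."""
    score, _ = play(always_defect, always_defect, rounds)
    return score
```

Even when only 20% of a clustered player's matches are against fellow cooperators, `avg_tft_score(0.2)` is about 279 versus the defectors' 200, so the cluster earns more points and, in the evolutionary reading, more offspring.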
id651266169
What happens if there is a little bit of random error in the game? Some noise in the system. For example, one player tries to cooperate, but it comes across as a defection. Little errors like this happen in the real world all the time. Like in 1983, the Soviet satellite-based early warning system detected the launch of an intercontinental ballistic missile from the US but the US hadn’t launched anything. The Soviet system had confused sunlight reflecting off high altitude clouds with a ballistic missile. Thankfully, Stanislav Petrov, the Soviet officer on duty, dismissed the alarm. But this example shows the potential costs of a signal error and the importance of studying the effects of noise on those strategies.
id651266786
When Tit for Tat plays against itself in a noisy environment, both start off by cooperating, but if a single cooperation is perceived as a defection, then the other Tit for Tat retaliates and it sets off a chain of alternating retaliations. And if another cooperation is perceived as a defection, then the rest of the game is constant mutual defection. Therefore, in the long run, both would only get a third of the points they would get in a perfect environment. Tit for Tat goes from performing very well to performing poorly. So how do you solve this? Well, you need a reliable way to break out of these echo effects. And one way to do this is by playing Tit for Tat, but with around 10% more forgiveness. So instead of retaliating after every defection, you only retaliate around nine out of every 10 times. This helps you break out of those echoes while still being retaliatory enough to not be taken advantage of.
✏️ Mistakes and miscommunication happen in reality, so one must account for a percentage of error. In this case, a more forgiving strategy (like Tit for Two Tats, or Tit for Tat with ~10% extra forgiveness) would be more successful because it can absorb those errors. In the list of traits above, being forgiving matters even more than being retaliatory.
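The echo effect described above can be sketched in a rough simulation, assuming a 10% misperception rate and the payoff values from the first highlight; `generosity` stands in for the ~10% forgiveness the quote describes, and all parameter choices here are mine:

```python
import random

PAYOFFS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def flip(move):
    return "D" if move == "C" else "C"

def noisy_match(generosity, rounds=5000, noise=0.10, seed=42):
    """Two identical Tit-for-Tat players whose moves are sometimes
    misperceived by the other side. `generosity` is the chance of
    forgiving a perceived defection instead of retaliating.
    Returns the pair's combined score."""
    rng = random.Random(seed)

    def move(perceived_opp_moves):
        if not perceived_opp_moves:
            return "C"                      # open by cooperating
        if perceived_opp_moves[-1] == "D" and rng.random() < generosity:
            return "C"                      # forgive: break the echo
        return perceived_opp_moves[-1]      # otherwise, mirror

    seen_by_a, seen_by_b = [], []           # each side's (noisy) view
    total = 0
    for _ in range(rounds):
        a, b = move(seen_by_a), move(seen_by_b)
        pa, pb = PAYOFFS[(a, b)]            # payoffs use the real moves
        total += pa + pb
        # each move may be garbled on its way to the other player
        seen_by_b.append(flip(a) if rng.random() < noise else a)
        seen_by_a.append(flip(b) if rng.random() < noise else b)
    return total
```

Comparing `noisy_match(0.0)` with `noisy_match(0.10)` shows the generous pair earning noticeably more combined points, because occasional forgiveness cuts the chains of alternating retaliation short.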
id651264513
My favorite example is Tit for Tat does really well, but it could never do better than the player it’s playing with. - I mean, think about it, by design, all they can do is lose or draw. And yet when the results of all interactions are tallied up, they come out ahead of all other strategies. Similarly, always defect can never lose a game. It can only draw or win, but overall, it performs extremely poorly. This highlights a common misconception because for many people when they think about winning, they think they need to beat the other person. In games like chess or poker, this is true since one person’s gain is necessarily another person’s loss, so these games are zero sum. But most of life is not zero sum. To win, you don’t need to get your reward from the other player. Instead, you can get it from the banker. Only in real life, the banker is the world. It is literally everything around you. It is just up to us to find those win-win situations, and then work together to unlock those rewards.
id651264979
You see, in the short term, it is often the environment that shapes the player that determines who does well. But in the long run, it is the players that shape the environment
✏️ It feels like cooperation is simply a good idea, regardless of civilization, society, and so on. This punches through veneer theory, because these traits show up across species and even computer programs. Regardless of your environment, with enough time and support, you can cooperate your way to victory.