1. Pinduoduo and The Rise of Social E-Commerce by Anu Hariharan and Nic Dardenne at Y Combinator. An excellent article on the Chinese e-commerce platform Pinduoduo and the features that make it so successful. For a deeper analysis of social e-commerce, see the whitepaper by ChinaChannel.
  2. The Importance of Being Causal by Bojinov et al. (2020) for Harvard Data Science Review. Four case studies of how LinkedIn added value to its product through insights gained from observational causal inference.
  3. Matt Botvinick on the Spontaneous Emergence of Learning Algorithms by Adam Scholl at LessWrong. Under certain conditions, RNNs develop reinforcement learning algorithms — unprompted. Similar behaviour can also be found in the human brain.
  4. Gut Instinct: How Your Diet Shapes Your Mind by David Cox for The Guardian. The enteric nervous system contains more than 100 million neurons, and roughly 80% of vagus nerve signals travel from gut to brain while only 20% travel from brain to gut, suggesting that the brain is predominantly a receiver of gut signals.
  5. Even the Best AI Models Are No Match for the Coronavirus by Will Knight for Wired. While quantitative models performed badly in March, integrating alternative data sources such as satellite imagery, flight and shipping information, and social media content is a promising approach to detecting future market volatility.
  6. Philosophers On GPT-3 by Justin Weinberg for Daily Nous. Brief thoughts by philosophers such as David Chalmers on the impact and meaning of GPT-3. 
  7. Zen and Rationality: Don’t Know Mind by G Gordon Worley III at LessWrong. The Zen idea of “Don’t Know Mind” under a rationalist perspective.
  8. The Lessons We All Must Learn from the A-Levels Algorithm Debacle by Matt Burgess for Wired. The Ofqual algorithm used to grade British students is a prime example of how flawed algorithmic output erodes public trust.
  9. Aren’t We Smart, Fellow Behavioural Scientists by Jason Collins. The replication crisis has cast a shadow over behavioural science. Collins highlights some of its probable causes.
  10. State-of-the-Art Language Models in 2020 by Hong Jing for Towards Data Science. A selection of current models for different NLP tasks: Transformers, BERT, and XLNet.