1. Summer Update: Tech and the New Normal by Ben Evans. An update to his trend deck based on the global changes brought on by COVID-19. Tech adoption has accelerated both from the time spent on devices to the consumption channels used. “For those who can survive and take advantage, huge markets are being reset and rethought.”
  2. The Things We Can’t Control Are Beautiful by Kevin Berger for Nautilus. An interview with poker player Maria Konnikova on decision making under uncertainty, our reluctance to pay respect to mere chance in storytelling, and why such randomness brings value.
  3. GPT-3 Creative Fiction by Gwern. “[…] GPT-3’s samples are not just close to human level: they are creative, witty, deep, meta, and often beautiful.” Particularly eerie is GPT-3’s take on the “Moloch” section of Allen Ginsberg’s “Howl”.
  4. What I Learned from Losing $200 Million by Bob Henderson for Nautilus. During the 2008 crisis, the derivatives trader Henderson watched his bets turn into ever-mounting losses, challenging his assumptions about how to manage risk.
  5. Reflecting on a Year of Making Machine Learning Actually Useful by Shreya Shankar. Shankar points out how – beyond the much-lauded models – data augmentation, interpretability for debugging, reproducibility and replicability, and correct ML pipelines improve the value of machine learning in production.
  6. Gödel’s Legacy: A Game Without End by Hazard at LessWrong. On the impact of Gödel’s Incompleteness Theorem on maths and problem solving. “The more adversarial the context, the more the boundaries dissolve. Gödel’s legacy is to show us that there is no limit to how far the boundaries can dissolve.”
  7. Tech Trend Radar 2020 – Stay Prepared Amidst Uncertainties at MunichRE. Great summary and visualization of the tech trends to watch, try out, and adopt. 
  8. Image Search with Text Feedback by Visiolinguistic Attention Learning by Chen et al. (2020). Visiolinguistic Attention Learning (VAL) uses a text encoder network and an image encoder network, fuses the text and image features, and thereby can suggest similar products (the paper focused on fashion) based on the customer’s text-based feedback.
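The retrieval step the paper describes – encode the reference image and the text feedback, fuse them, then rank catalog items by similarity – can be sketched in a few lines. Everything below is a toy stand-in: real VAL fuses CNN image features with text features via hierarchical attention, whereas here the embeddings are hand-made 3-d vectors and the fusion is simple addition, just to make the composed-query idea concrete.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def fuse(image_vec, text_vec):
    # Placeholder fusion: shift the image embedding in the direction
    # of the text feedback ("like this one, but in red").
    return [i + t for i, t in zip(image_vec, text_vec)]

# Hypothetical 3-d embedding axes: (sleeve length, redness, formality)
reference_image = [0.9, 0.1, 0.5]   # long-sleeved, not red
feedback_text   = [0.0, 0.8, 0.0]   # "in red"

catalog = {
    "red long-sleeve dress":  [0.9, 0.9, 0.5],
    "blue long-sleeve dress": [0.9, 0.1, 0.5],
    "red t-shirt":            [0.1, 0.9, 0.1],
}

query = fuse(reference_image, feedback_text)
best = max(catalog, key=lambda name: cosine(query, catalog[name]))
print(best)  # the red long-sleeve dress ranks first
```

The point of the composed query is that neither the image alone (which ranks the blue dress equally high) nor the text alone (which ranks the t-shirt high) retrieves the right item.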
  9. Why Computers Won’t Be Reading Your Mind Any Time Soon by Nicole Kobie for Wired. Brain-computer interfaces still face a myriad of challenges, from invasiveness to overblown reporting.
  10. What Can I Do Here? A Theory of Affordances in Reinforcement Learning by Khetarpal et al. (2020). The theory of affordances holds that intelligent agents perceive an object together with the actions it affords. With a new twist on reinforcement learning, DeepMind wants to integrate this knowledge to reduce complexity and help agents generalize across environments.
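The complexity reduction can be illustrated with a minimal sketch: restrict the action space to what the current object affords before picking the greedy action. Note the affordance map below is hand-written and purely hypothetical; in the paper, affordances are defined via learned "intents" rather than a fixed lookup table.

```python
# Toy illustration of affordance-pruned action selection, loosely in
# the spirit of Khetarpal et al. (2020). The affordance table is a
# hand-made assumption, not a learned model.

ACTIONS = ["open", "push", "pour", "drink"]

# Which actions each object affords (hypothetical toy mapping).
AFFORDANCES = {
    "door": {"open", "push"},
    "cup":  {"pour", "drink"},
}

def afforded_actions(obj):
    """Restrict the full action space to actions the object affords."""
    return [a for a in ACTIONS if a in AFFORDANCES.get(obj, set())]

def greedy_policy(obj, q_values):
    """Pick the highest-valued *afforded* action, instead of
    maximizing over the entire action space."""
    candidates = afforded_actions(obj)
    return max(candidates, key=lambda a: q_values.get(a, 0.0))

q = {"open": 0.2, "push": 0.7, "pour": 0.9, "drink": 0.4}
print(greedy_policy("door", q))  # -> push ("pour" scores higher but is not afforded)
```

Even in this toy form, the pruning shows the intended benefit: the agent never wastes value estimates or exploration on actions that cannot apply, which is where the complexity reduction comes from.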