Hello from Morningside! Here were my thoughts over the last month.
☕️ 3/13
One of my favorite aspects of British culture is hearing U.K. rappers rap fondly about their “cuppa.”
🌎 3/14
Accelerated summaries of the Universe’s history are delightful because they encapsulate so much in so few words. It’s like a massive genie in a tiny bottle, promising to grant any wish: if all of that was possible, then anything must be. Who did it best: The Big Bang Theory’s intro, Jens Lekman, or Bill Wurtz?
🗳 3/16
My interpretation of Scott Alexander’s forecasting survey results is that scientific consensus is the only truth-finding mechanism that reliably beats democracy. Frustratingly, there doesn’t seem to be a simple way to harness it for choosing world leaders and policies without somehow corrupting the process of eliciting it.
🧱 3/17
Where Donald Trump promised “The Wall,” a self-consistent Ron DeSantis might someday promise “The Firewall”: an American analog to authoritarian internet filters like those used by China or North Korea. Seems like it would fit well with his pro-censorship and anti-Big-Tech tendencies.
⚙️ 3/18
Sam Harris’s conversation with Stuart Russell and Gary Marcus, The Trouble with AI, offers a great portrayal of the main drawback of massive neural networks: correlated failure cases. Advanced Go models, many times better than the one that famously defeated Lee Sedol, can be beaten with simple, human-understandable strategies discovered by adversarial search algorithms. Image generation models struggle with drawing text and hands. And models like GPT-4 fail to multiply large numbers or perform simple counting tasks, and hallucinate frequently.
There’s a clear lesson here: there is no such thing as infinite reliability. A perfect model would be indistinguishable from what it models; to be maximally super-intelligent is to be the Universe.
Instead, any first-generation model whose intelligence rivals that of humans will have similar obvious pitfalls: an over-reliance on mental shortcuts to make sense of certain aspects of the world. That model will be as much a risk to itself as it will be to us.
🦪 3/20
I’m considering a midyear resolution: to become ostrovegan. Oysters, clams, and mussels all seem about as unlikely to feel pain as plants do, so they’re essentially meat crops. That means they’re a totally valid way to fill out the nutrients that veganism misses. I think I appreciate the elegance of this solution just enough to finally try eliminating cruelty from my diet!
My plan is to take it slow and steady. I’ll go about eliminating one food group each month until I’m 100 percent there. This month will be pork — perhaps the smartest of the animals that suffer for human diets. I’ll check back in with y’all next month on how it goes.
🧑🎨 3/28
“Agent Designer” will be the next hot job title, representing end-to-end dataset, model, fine-tuning, prompt, and chain engineers who know how to isolate a large language model’s scope of reliability to a useful specialty.
🧋3/30
KOI Thé is an amazing Taiwanese bubble tea chain that’s all over Vietnam and apparently has a location in Union Square (!). Sadly, it won’t have the Vietnam-specific “Dark Lava,” which boasts an incredible technology: crispy bits of chocolate that don’t get soggy in tea.
🏪 4/1
One of the difficulties of planning virgin urban territory is predicting how communities will occupy the space. While on the train of thought that led to last week’s post, I realized this is probably especially hard in three dimensions, where there are virtually no examples of organic growth to learn from (almost all vertical structures have their plumbing, electricity, and floor plans decided well before construction).
One potential solution: train a model to simulate the organic growth of thriving neighborhoods that are flat (based on the immense wealth of data that exists around those), then project the model’s predictions onto a three-dimensional lattice that factors in transportation modes like elevators and escalators. In other words — swap out the model’s internal representation of distance, and get out optimally planned skyscrapers.
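To make the distance swap concrete, here’s a toy Python sketch — with made-up walking and elevator speeds, and no actual growth model attached — of the kind of travel-time metric you’d hand the model in place of flat street distance:

```python
import math

# Toy "swap the distance metric" sketch. A skyscraper is a 3D lattice of
# cells (x, y, floor); instead of straight-line distance, we measure how
# long it takes to actually get between two cells. All constants below are
# illustrative placeholders, not real planning numbers.
WALK_SPEED = 1.4       # m/s along corridors
ELEVATOR_SPEED = 3.0   # m/s between floors
ELEVATOR_WAIT = 30.0   # s flat penalty to call and board an elevator
FLOOR_HEIGHT = 4.0     # m per floor

def travel_time(a, b):
    """Rough seconds to get from cell a to cell b, where cells are (x, y, floor)."""
    horizontal = math.dist(a[:2], b[:2]) / WALK_SPEED
    floors_apart = abs(a[2] - b[2])
    vertical = 0.0
    if floors_apart:
        vertical = ELEVATOR_WAIT + floors_apart * FLOOR_HEIGHT / ELEVATOR_SPEED
    return horizontal + vertical

# A growth model trained on flat neighborhoods only ever asks "how near are
# these two lots?" Answering in travel time instead of street distance lets
# the same model reason vertically: with a decent elevator, a shop 20 floors
# up can be "closer" than one 200 meters down the block.
print(travel_time((0, 0, 0), (200, 0, 0)))  # ~143 s on foot
print(travel_time((0, 0, 0), (0, 0, 20)))   # ~57 s by elevator
```

Under a metric like this, the “neighborhood” around a lobby café looks very different from the one around a 40th-floor gym — which is exactly the structure you’d want the planner to pick up on.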
🏙 4/5
I learned that Bangkok is an extreme primate city, which besides sounding very funny means it’s many times larger than any other city in its country.
🥊 4/6
Francis Ngannou has one of the most amazing stories ever: going from extreme poverty and a boat migration to Europe, through jail time and homelessness upon arrival, to becoming the reigning UFC heavyweight champion. I highly recommend hearing the story from the man himself — his calm in the face of hardship is the mark of a true fighter. Check out this interview with him, produced by my friend David Zha.
🥐 4/13
This appreciation of French culture by a visiting Italian fills me with pride: