Ilya!!!!
www.dwarkesh.com/p/ilya-sutsk...
this is the theme — you can’t have AGI without existing in and learning from the real world
ahh yes, these are the sort of quotes I'm here for
dwarkesh nervously: "ahahaha yes of course, the value function for human emotions, makes perfect sense"
"Emotions are so simple, it would be cool to map them out in a human understandable way"
—Ilya
guys, i'm losing it over here. this can't be real
he must be doing it on purpose, just handing us quotable quotes on repeat for an hour and a half
let's stop picking on Ilya for a moment, this is a great point
scaling consumed so much economic attention that it was difficult to convince your boss or a VC to let you look anywhere else
and that can't have good consequences, that surely leads to blind spots
"if you're really doing something different, do you really need the absolute maximal scale to prove it?"
Ilya explaining why $3B gives them a ton of compute for research: other labs are spending huge amounts on inference + staff to serve users, whereas all of SSI's compute is dedicated to research
context: AGI isn't here, that's the nature of AI, you're always talking about something that *will be* but isn't tangible today
it's called a bubble pop