Bluesky Thread

i tore this apart this morning, the gist:

- yes, it separates knowledge from reasoning 🎉
- it trades MHA computational complexity for knowledge graph schema design 🤔 (rough sketch of the mechanism below)

i’m partly ecstatic, this is huge, but also disappointed bc KG design is largely unsolved
Sung Kim @sungkim.bsky.social
Microsoft's KBLaM, an approach that encodes and stores structured knowledge within an LLM itself. By integrating knowledge without retraining, it offers a scalable alternative to traditional methods.

www.microsoft.com/en-us/resear...
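
for anyone curious what the trade looks like mechanically, here's a rough numpy sketch of my reading of it (shapes, names, and random vectors are all made up, this is not the paper's code): each KB triple becomes a key/value "knowledge token", and prompt tokens attend over those one-way on top of normal self-attention, so cost grows linearly with the KB instead of quadratically with context

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64            # head dim (made up)
n_prompt = 8      # prompt tokens
n_triples = 200   # triples in the external KB

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# ordinary per-layer projections of the prompt (random stand-ins)
Q  = rng.standard_normal((n_prompt, d))
Kp = rng.standard_normal((n_prompt, d))
Vp = rng.standard_normal((n_prompt, d))

# one (key, value) "knowledge token" per triple -- in the paper these come from
# a sentence encoder over the triple plus learned linear adapters; random here
Kk = rng.standard_normal((n_triples, d))
Vk = rng.standard_normal((n_triples, d))

# "rectangular" attention: prompt queries attend over knowledge keys AND prompt keys,
# but knowledge tokens never attend back, so cost is O(n_prompt * n_triples), not quadratic
K = np.concatenate([Kk, Kp], axis=0)
V = np.concatenate([Vk, Vp], axis=0)
scores = Q @ K.T / np.sqrt(d)           # (n_prompt, n_triples + n_prompt)
out = softmax(scores, axis=-1) @ V      # (n_prompt, d)
print(out.shape)                        # (8, 64)
```

the point of the sketch: the attention math is the easy part. the hard part is getting your knowledge into those triples in the first place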
a big problem with the KG tie-in is that KGs lose information, a ton

you have to boil everything down to entity-relation-entity triples, and a lot of knowledge (e.g. procedural knowledge) doesn’t fit into that (toy example below)

this probably made a lot more sense in the early days of LLMs
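
toy example of the lossiness i mean (all the content is made up, just to show the shape of the problem):

```python
# facts compress into entity-relation-entity triples without much loss
facts = [
    ("kblam", "developed_by", "microsoft"),
    ("kblam", "stores_knowledge_as", "knowledge_tokens"),
]

# procedural knowledge has ordering, conditions, and shared state
procedure = [
    "1. chunk the source docs",
    "2. if a chunk has no clear subject, merge it with its neighbor",
    "3. extract triples from each merged chunk",
]

# you can fake it with triples like ("step_2", "follows", "step_1"),
# but the conditional in step 2 and the merge state get mangled or dropped
```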
the achilles heel is that it goes against the bitter lesson

instead of throwing more compute at a problem, we solve it through brain power

but it’s worse: the bitter lesson was about researchers; this would require ALL USERS to have superb KG design skills

that’s a bad scaling dynamic
ChatGPT sesh

my process:

1. read the paper, kinda, until i have unfounded confidence on the topic

2. spout my unfounded confidence into an o3-mini-high chat

3. let o3 bring me back to earth, touch grass

chatgpt.com/share/67db6a...
ChatGPT - KBLAM Mechanism Understanding