stokes | Oct 05, 2025 23:36
assume we do get _alignable_ AGI in the next few years. the most plausible mechanism from today's standpoint seems to be "faithful chain of thought" (h/t AI 2027). does this mean we must rely on a pseudoreligious elite poring over millions of pages of text to ensure alignment?