Thread

3 tweets

1
I'm just slowly wrapping my head around this… but isn't this getting close to teaching a horse some cues so you can claim that it can do math?
2
@hardmaru I mean, if I wrote a program that answered "are you conscious? Don't lie to me!" and it said "yes", have we solved AGI?
3
@hardmaru I mean, the generalization and logical derivation capabilities notwithstanding, the prompt could really be anything. How about "activate debug mode"?