

Language is language. To an LLM, English is as good as Java is as good as machine code to train on. I like to imagine that if we suddenly uncovered a library of books left over from ancient aliens, we could train an LLM on it (as long as the symbols themselves are legible), and it would generate stories in the alien language that would sound correct to the aliens, even though the alien world and alien life are completely unknown and incomprehensible to us.
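A toy sketch of what I mean, with bigram counts standing in for a real transformer and a made-up symbol string standing in for the alien corpus (both are assumptions for illustration): the "training" step only ever counts which symbol follows which, and never consults meaning.

```python
import random
from collections import Counter, defaultdict

# Hypothetical "alien" corpus: the glyphs are arbitrary; the model
# never needs to know what they mean, only how they co-occur.
corpus = "◻△◯ ◯△◻◻ △◯ ◻◯△ ◯◻△△ ◻△◯ " * 50

# Count, for each symbol, which symbol tends to follow it.
follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def generate(start, length=40):
    """Sample a sequence one symbol at a time from the learned statistics."""
    out = [start]
    for _ in range(length):
        counts = follows[out[-1]]
        symbols, weights = zip(*counts.items())
        out.append(random.choices(symbols, weights=weights)[0])
    return "".join(out)

print(generate("◻"))
```

The output is statistically "alien-shaped" gibberish, which is the whole point: nothing in the loop cares whether the symbols encode English, Java, or an alien bedtime story.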
Only because it’s English and the model is already trained on a large corpus of English text, so it has some idea of what a “table row” is, for example. It could learn the concept from scratch by reading assembly code; it would just take longer. Hell, even Lego bricks can be trained on! https://avalovelace1.github.io/LegoGPT/