I think this was the best write-up on the impact of AI on software engineering I've read yet. By extension, it might be the best 'take' on AI I've read, period.
Alas, it is not written by AI and boosted on X (which is owned by an LLM vendor), and therefore will not get 80m views or whatever.
> Some observers have pointed out cases where CCC [sic] appears to regenerate artifacts strongly resembling existing implementations, including standard headers
It's not just that: there was another thread here on HN recently discussing how LLMs can reproduce the entire text of Harry Potter with ~99% accuracy when prompted to do so with a jailbreak. This seems to contradict the “it’s remarkable progress” statement at the top of the article.
Both can be true, right? A model can be a savant memorizer _and_ a good reasoner?
LLMs can't reason because they fundamentally don't understand anything they generate.
How does that contradict the claim at all?
This is good and also parallels the other recent post: AI makes you boring.
I don't think this compiler makes the argument it thinks it does: LLMs are able to statistically reproduce their source material, you cannot copyright things that were not produced by a human hand, and you cannot copyright things that are covered under the phone-book ruling.
The future of software, if it is to be filled with slop, will also be uncopyrightable and stolen without attribution.