Roman Suzi
1 min readJan 10, 2023


I've tried many things with ChatGPT. It's possible to make it draw with the Logo language or create graphs with Graphviz. It can even write simple proofs (like x + y = y + x) for a theorem-proving assistant, but an LLM is just a huge parrot. It "knows" only what has been put there.
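As a rough illustration, the kind of "simple proof" mentioned above might look like this in Lean 4 (a sketch, assuming the statement is discharged with the standard library lemma Nat.add_comm; how ChatGPT actually phrased its proof is not shown here):

```lean
-- Commutativity of natural-number addition: x + y = y + x.
-- Proved by appealing to the existing library lemma Nat.add_comm.
example (x y : Nat) : x + y = y + x := Nat.add_comm x y
```

Even a proof this small requires the tool to know the exact lemma name and argument order, which is the sort of memorized detail an LLM can parrot back well.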

Well, I can imagine some human software developers are also parrots, copy-pasting code from here and there and reacting to "code reviews" by promising to correct an error, only to produce another one.

However, language capabilities are only part of the equation. AI can be a nice tool for software development, and, quite possibly, AI-adapted programming languages will emerge to facilitate programming by machines for machines.

Still, an LLM is a huge reservoir of "common sense" and real-life ontology. So I suspect that making, for example, a first draft of an ontology can be greatly accelerated by summarizing knowledge with AI. For well-known domains, AI can serve as a subject matter expert. Searching documentation can also be made easier.

This is a cool and useful development, but there is still a long way to go before smart humans become obsolete.

If AI helps solve the formal verification problem for software, that will raise software quality tremendously. Human wisdom will be needed as long as there are humans on this planet, though.
