When working with transformers such as GPT-3, the input matters a great deal: the prompt provides the context that shapes how the output is structured. Crafting and debugging these prompts can take a significant amount of time.
A transformer takes in an input string of text and outputs what it predicts most likely comes next. This makes it great at handling typos and processing complex sentences, but poor at any form of explicit logic.
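To make the "predict what comes next" framing concrete, here is a deliberately tiny sketch, not a real transformer: a toy model that counts word bigrams in a short corpus and greedily emits whichever word most often followed the previous one. The corpus and function names are illustrative inventions; an actual transformer operates on learned token representations with attention, but the input-string-to-likely-continuation shape is the same.

```python
from collections import Counter, defaultdict

# Toy corpus (hypothetical): the model only "knows" these word transitions.
corpus = "the cat sat on the mat and the cat sat on the rug".split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def complete(prompt_word, steps=3):
    """Greedily extend the prompt by the most frequent next word."""
    out = [prompt_word]
    for _ in range(steps):
        candidates = following[out[-1]]
        if not candidates:
            break  # no continuation seen in the corpus
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(complete("the"))  # → "the cat sat on"
```

The toy model has no understanding of grammar or logic; it simply reproduces the statistically most common continuation, which is why a prompt's wording can steer the output so strongly.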