A prompt is an instruction that we give to an LLM.
Even though GPT models are quite good at handling unstructured instructions, there are several techniques that let you directly influence the model's behavior.
Prompts don't have to be grammatically correct: the LLM will usually still understand them, and dropping filler words can save tokens.
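As a rough sketch of the savings, compare a polite full-sentence prompt with a terse one. The word count below is only a crude proxy for tokens (real models use subword tokenizers such as BPE, so actual counts will differ), and the prompts themselves are made-up examples:

```python
# Compare a verbose prompt with a terse one to illustrate token savings.
# Word count is a crude proxy for tokens; a real tokenizer (e.g. BPE)
# would give different numbers, but the relative saving is similar.

verbose = ("Could you please summarize the following article for me "
           "in three short bullet points?")
terse = "Summarize article, 3 bullet points:"

def rough_token_count(prompt: str) -> int:
    """Approximate the token count by splitting on whitespace."""
    return len(prompt.split())

print(rough_token_count(verbose))  # 14
print(rough_token_count(terse))    # 5
```

Both prompts ask for the same thing, but the terse version uses far fewer words, which typically translates into fewer billed tokens.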