A family of pretrained and fine-tuned large language models ranging in scale from 7 billion to 70 billion parameters.