“Apple released OpenELM, a family of open language models ranging from 270 million to 3 billion parameters, designed to run on-device. OpenELM outperforms existing LLMs of comparable size that were pretrained on publicly available datasets. Apple also released CoreNet, a library for training deep neural networks.”