Data Phoenix Digest - ISSUE 7.2024
Explore recordings of all Data Phoenix past webinars and AI highlights of the latest week.
Apple recently released the OpenELM (Open Efficient Language Model) family of models on Hugging Face. Beyond the range of model sizes, a key differentiator of the OpenELM models is their non-uniform allocation of parameters across the layers of the transformer model.
Microsoft released Phi-3-mini, a 3.8-billion-parameter model small enough to run in contexts with constrained hardware and network resources. Its strong performance results from carefully curated dataset construction and training processes.
The Financial Times has struck a deal with OpenAI. The latter will pay an undisclosed licensing fee to access the FT's archive for training and to have ChatGPT provide summaries of and links to FT articles. The deal comes as OpenAI faces mounting legal challenges over data-scraping accusations.