Data Phoenix Digest - ISSUE 38
Distillation of BERT-like models, getting started with Comet ML, hands-on with SciKit-Learn feature-engineering, ensembling off-the-shelf models for GAN training, generative art using neural visual grammars and dual encoders, DSP-SLAM, jobs, and more ...
- Fine-Tuning of GPT-3 Is Now Officially Allowed, Says OpenAI
- France Forces Clearview AI to Eliminate the Data
- How ML and AI Work in the Travel Industry
Paper Review: NL-Augmenter: A Framework for Task-Sensitive Natural Language Augmentation
This paper review shares a look at a new participatory Python-based natural language augmentation framework that supports the creation of transformations and filters.
Distillation of BERT-Like Models: The Theory
Let’s explore the mechanisms behind DistilBERT’s approach, from the basics through architectures and the distillation loss, plus other useful details you may need in your implementation.
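As a taste of the distillation loss discussed in the article, here is a minimal NumPy sketch of the soft-target term: cross-entropy between the teacher's and student's temperature-softened distributions, scaled by T² (function names and the toy logits are illustrative, not from the article):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer probabilities.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Cross-entropy between the teacher's softened distribution and the
    # student's, scaled by T**2 to keep gradient magnitudes comparable.
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    return -(p_teacher * log_p_student).sum(axis=-1).mean() * T ** 2

teacher = np.array([[2.0, 0.0, 0.0]])
aligned = distillation_loss(teacher, teacher)          # student matches teacher
mismatched = distillation_loss(np.array([[0.0, 2.0, 0.0]]), teacher)
```

In training this term is typically combined with the hard-label cross-entropy as a weighted sum.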
Mixed Neural Style Transfer With Two Style Images
Neural style transfer (NST) is a fascinating field. Let’s learn how to apply the styles of two images to one photo, analyze the optimization process, and see how to extend NST.
Hyperparameter Tuning of Neural Networks with Optuna and PyTorch
In this article, you’ll learn how to tune hyperparameters of neural networks in PyTorch and how to find that perfect neural network model with the help of Optuna.
Getting Started with Comet ML
Should you give Comet ML a try? In this article, you’ll find an overview of this popular ML experimentation platform, with a practical example. And then you can decide!
Time-Series Analysis: Hands-On with SciKit-Learn Feature-Engineering
In this hands-on guide, you’ll work through time-series analysis with SciKit-Learn feature engineering, from exploratory data analysis through feature creation to modelling.
Ensembling Off-The-Shelf Models for GAN Training
Nupur Kumari et al. propose an effective selection mechanism for pretrained CV models. It allows choosing the most accurate model and progressively adding it to the discriminator ensemble.
Generative Art Using Neural Visual Grammars and Dual Encoders
In this paper, Chrisantha Fernando et al. present a novel algorithm for producing generative art. It allows a user to input a text string and outputs an image that interprets that string.
Implicit MLE: Backpropagating Through Discrete Exponential Family Distributions
Implicit Maximum Likelihood Estimation (I-MLE) is a framework for end-to-end learning of models combining discrete exponential family distributions and differentiable neural components.
DSP-SLAM: Object Oriented SLAM with Deep Shape Priors
DSP-SLAM is an object-oriented SLAM system that builds a rich and accurate joint map of dense 3D models for foreground objects, and sparse landmark points to represent the background.
- Major Donor Data Specialist - Wikimedia Foundation (Remote)
- Sr. Data Engineer - HashiCorp (US - Remote)
- Machine Learning Architect - SoftServe (Odesa, Kyiv, Lviv...)
- Senior/Middle CV/ML Engineer - Apostera (Odesa, Kyiv, Remote)
- Data Scientist - Snap (Odesa, Kyiv)
Looking to feature your open positions in the digest? Kindly reach out to us at [email protected] for details. We'll be proud to help your business thrive!
Data Phoenix Newsletter
Join the newsletter to receive the latest updates in your inbox.