News
3rd issue of the Gradio newsletter!
Every two weeks, we'll be sharing the latest news from the machine learning world, including state-of-the-art machine learning models and demos that you can try right in your browser. Read on below!
- Gradio Version 2.7.5
- New Gradio Hugging Face Spaces Demos
- First Order Motion Model
- GLIDE: Towards Photorealistic Image Generation and Editing with Text-Guided Diffusion Models
- MT3: Multi-Task Multitrack Music Transcription
- JoJoGAN: One Shot Face Stylization
- PaddleSeg - Matting
First of all, if you haven't already done so, consider taking 5 seconds to give us a GitHub star and help our free, open-source library become more visible!
Gradio Version 2.7.5
We released Gradio version 2.7.5 with a lot of new features. Read the full release notes here, or see our summary of the key enhancements below:
- Backend Migration to Starlette and FastAPI
- (BETA) Custom Flagging Callbacks and HuggingFaceDatasetSaver (see the sketch below)
- Small Fixes
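If you want to try the beta flagging callbacks, here is a minimal sketch of wiring HuggingFaceDatasetSaver into an Interface. The token value, the dataset name, and the greet function are placeholders, and the exact import path and keyword arguments may differ slightly in the beta release:

```python
import gradio as gr

# Hypothetical placeholder values: use your own Hugging Face access token
# and a dataset repo that you control.
HF_TOKEN = "hf_xxx"
DATASET_NAME = "gradio-flagged-examples"

def greet(name):
    # Toy function standing in for a real model.
    return f"Hello, {name}!"

# HuggingFaceDatasetSaver pushes flagged samples to a Hugging Face dataset
# instead of saving them to a local CSV file.
flag_callback = gr.HuggingFaceDatasetSaver(HF_TOKEN, DATASET_NAME)

gr.Interface(
    fn=greet,
    inputs="text",
    outputs="text",
    flagging_callback=flag_callback,
).launch()
```

Every time a user clicks the flag button, the submitted inputs and outputs are appended to the specified dataset, which makes it easy to crowdsource examples from your demo.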
Hugging Face Spaces
Spaces is out of beta, so you can now create your demos on Spaces using Gradio. Here are some good ones to check out.
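If you haven't built a Space yet, the app.py behind a Gradio Space can be as small as the sketch below; the flip function is a made-up placeholder, and a real Space only needs this file plus a requirements.txt that lists gradio:

```python
# app.py for a minimal Gradio Space
import gradio as gr

def flip(text):
    # Trivial placeholder; a real Space would wrap an actual model here.
    return text[::-1]

demo = gr.Interface(fn=flip, inputs="text", outputs="text", title="Reverse Text")
demo.launch()
```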

glide-text2im
GLIDE: Towards Photorealistic Image Generation and Editing with Text-Guided Diffusion Models.

MT3: Multi-Task Multitrack Music Transcription
MT3 is a multi-instrument automatic music transcription model that uses the T5X framework.
- paper: https://arxiv.org/abs/2111.03017
- Spaces Demo: https://huggingface.co/spaces/akhaliq/MT3
- Spaces Code: https://huggingface.co/spaces/akhaliq/MT3/blob/main/app.py

JoJoGAN: One Shot Face Stylization
Only a single reference image is needed for training, which takes about 1 minute on a GPU in Colab. After training, the style can be applied to any input image.
- paper: https://arxiv.org/abs/2112.11641
- Spaces Demo: https://huggingface.co/spaces/akhaliq/JoJoGAN
- Spaces Code: https://huggingface.co/spaces/akhaliq/JoJoGAN/blob/main/app.py
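As a side note, any public Space like this one can also be pulled into your own Gradio app. This is a minimal sketch, assuming the Space stays public and that Interface.load accepts the spaces/ prefix as in recent Gradio releases:

```python
import gradio as gr

# Load the hosted JoJoGAN demo as an Interface; this queries the public Space
# rather than downloading the model locally.
demo = gr.Interface.load("spaces/akhaliq/JoJoGAN")

# Launch it locally, or reuse it as a building block in a larger interface.
demo.launch()
```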

PaddleSeg - Matting
An easy-to-use image segmentation library with an awesome pre-trained model zoo, supporting a wide range of practical tasks from research to industrial applications.