Stable Diffusion is a deep learning, text-to-image model released in 2022 based on diffusion techniques. The generative artificial intelligence technology is the premier product of Stability AI and is considered to be a part of the ongoing artificial intelligence boom.
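As an illustrative sketch (not drawn from the text above), Stable Diffusion checkpoints can typically be run through Hugging Face's diffusers library. The checkpoint name, prompt, and GPU assumption below are illustrative choices, not details stated in the source.

```python
# Minimal text-to-image sketch using the Hugging Face diffusers library.
# The checkpoint name and prompt are assumptions for illustration.
import torch
from diffusers import StableDiffusionPipeline

# "runwayml/stable-diffusion-v1-5" is an assumed, commonly used checkpoint identifier.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes a CUDA-capable GPU is available

# Generate one image from a text prompt and save it to disk.
image = pipe("a watercolor painting of a lighthouse at dawn").images[0]
image.save("lighthouse.png")
```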
Hugging Face, Inc. is an American company incorporated under the Delaware General Corporation Law [1] and based in New York City that develops computation tools for building applications using machine learning.
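For a sense of the kind of tooling described, the company's transformers library exposes a high-level pipeline API. The task and example sentence below are assumptions for illustration, not details from the source.

```python
# Small sketch of the transformers pipeline API.
from transformers import pipeline

# Downloads a default sentiment-analysis model on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("Stable Diffusion made text-to-image generation widely accessible."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```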
Stability AI was founded in 2019 by Emad Mostaque. [2] [3] [4] In August 2022, Stability AI rose to prominence with the release of Stable Diffusion, its text-to-image model with publicly available source code and weights.
LAION (acronym for Large-scale Artificial Intelligence Open Network) is a German non-profit which makes open-source artificial intelligence models and datasets. [1] It is best known for releasing large datasets of images and captions scraped from the web, which have been used to train a number of high-profile text-to-image models, including Stable Diffusion and Imagen.
The Latent Diffusion Model (LDM) [1] is a diffusion model architecture developed by the CompVis (Computer Vision & Learning) [2] group at LMU Munich. [3] Introduced in 2015, diffusion models (DMs) are trained with the objective of removing successive applications of noise (commonly Gaussian) on training images.
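A minimal sketch of that training objective, under standard DDPM-style assumptions: add Gaussian noise to an image at a random timestep, then train a network to predict the noise that was added (the LDM applies the same idea in a compressed latent space rather than in pixel space). The noise schedule, shapes, and the model interface below are assumptions for illustration.

```python
# Sketch of the noise-prediction (epsilon-prediction) diffusion objective.
import torch
import torch.nn.functional as F

T = 1000
betas = torch.linspace(1e-4, 0.02, T)              # linear noise schedule (assumed)
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)  # cumulative signal retention

def diffusion_loss(model, x0):
    """x0: a batch of clean images, shape (B, C, H, W)."""
    b = x0.shape[0]
    t = torch.randint(0, T, (b,), device=x0.device)            # random timesteps
    noise = torch.randn_like(x0)                                # Gaussian noise
    a_bar = alphas_cumprod.to(x0.device)[t].view(b, 1, 1, 1)
    x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise      # noised image
    pred = model(x_t, t)                                        # predict the added noise
    return F.mse_loss(pred, noise)
```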
LoRA-based fine-tuning has become popular in the Stable Diffusion community. [14] Support for LoRA was integrated into the Diffusers library from Hugging Face. [15] Support for LoRA and similar techniques is also available for a wide range of other models through Hugging Face's Parameter-Efficient Fine-Tuning (PEFT) package. [16]
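As a hedged sketch of the PEFT package mentioned above, a model can be wrapped with LoRA adapters so that only small low-rank matrices are trained. The base model and target module names below are assumptions chosen for illustration.

```python
# Sketch of LoRA-style parameter-efficient fine-tuning via Hugging Face PEFT.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("gpt2")  # small stand-in base model (assumed)

config = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    target_modules=["c_attn"],  # attention projection in GPT-2 (assumed target)
    lora_dropout=0.05,
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the small LoRA matrices are trainable
```

In the Stable Diffusion setting, the Diffusers library can similarly load LoRA weights onto a pipeline, but the exact workflow depends on the checkpoint and adapter format.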
In August 2022, the company co-released Stable Diffusion, an improved version of its Latent Diffusion Model, together with the CompVis group at Ludwig Maximilian University of Munich and with a compute donation from Stability AI. [14] [15] On December 21, 2022, Runway raised US$50 million [16] in a Series C round.
BigScience Large Open-science Open-access Multilingual Language Model (BLOOM) [1] [2] is a 176-billion-parameter transformer-based autoregressive large language model (LLM). The model, as well as the code base and the data used to train it, are distributed under free licences. [3]
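As a minimal sketch, BLOOM checkpoints can be loaded through the transformers library; the smaller bigscience/bloom-560m checkpoint stands in below as an assumption, since the full 176-billion-parameter model requires far more memory than a single typical machine provides.

```python
# Sketch of loading a (small) BLOOM checkpoint and generating text.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-560m")

inputs = tokenizer("BLOOM is a multilingual language model that", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```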