enow.com Web Search

Search results

  1. Unsupervised learning - Wikipedia

    en.wikipedia.org/wiki/Unsupervised_learning

    Unsupervised learning is a framework in machine learning where, in contrast to supervised learning, algorithms learn patterns exclusively from unlabeled data.[1] Other frameworks in the spectrum of supervision include weak- or semi-supervision, where a small portion of the data is tagged, and self-supervision.
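
    As a minimal illustration of the framework described above (not taken from the article), the sketch below runs plain k-means on a made-up, unlabeled two-blob dataset: the algorithm discovers the grouping from the feature vectors alone, with no target labels anywhere. The data, number of clusters, and iteration count are arbitrary choices for the example.

        import numpy as np

        def kmeans(X, k, iters=100, seed=0):
            """Plain k-means: group unlabeled points X (n, d) into k clusters."""
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), size=k, replace=False)]
            for _ in range(iters):
                # Assign each point to its nearest center (no labels used anywhere).
                d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
                assign = d.argmin(axis=1)
                # Move each center to the mean of its assigned points.
                new_centers = np.array([X[assign == j].mean(axis=0) if np.any(assign == j)
                                        else centers[j] for j in range(k)])
                if np.allclose(new_centers, centers):
                    break
                centers = new_centers
            return centers, assign

        # Toy unlabeled data: two Gaussian blobs, with no class labels attached.
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
        centers, assign = kmeans(X, k=2)
        print(centers)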

  2. Competitive learning - Wikipedia

    en.wikipedia.org/wiki/Competitive_learning

    Competitive learning is a form of unsupervised learning in artificial neural networks, in which nodes compete for the right to respond to a subset of the input data.[1][2] A variant of Hebbian learning, competitive learning works by increasing the specialization of each node in the network.
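
    The snippet describes the mechanism only in words; the following is a hedged, numpy-only sketch of a winner-take-all form of competitive learning on made-up data. For each input, the node whose weight vector is closest "wins" and only that node's weights move toward the input, which is what drives the specialization mentioned above. The learning rate, data, and number of nodes are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 3))        # unlabeled inputs (made up for the sketch)
        W = rng.normal(size=(4, 3))          # one weight vector per competing node

        lr = 0.1
        for epoch in range(20):
            for x in X:
                # Nodes compete: the one closest to the input wins the right to respond.
                winner = np.argmin(np.linalg.norm(W - x, axis=1))
                # Only the winner is updated (winner-take-all), pulling its weights
                # toward the input so it specializes on that region of the data.
                W[winner] += lr * (x - W[winner])
        print(W)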

  3. Conceptual clustering - Wikipedia

    en.wikipedia.org/wiki/Conceptual_clustering

    Conceptual clustering is a machine learning paradigm for unsupervised classification, defined by Ryszard S. Michalski in 1980 (Fisher 1987, Michalski 1980) and developed mainly during the 1980s.

  4. Self-organizing map - Wikipedia

    en.wikipedia.org/wiki/Self-organizing_map

    The training examples are usually presented to the network several times, as iterations. The training utilizes competitive learning. When a training example is fed to the network, its Euclidean distance to all weight vectors is computed. The neuron whose weight vector is most similar to the input is called the best matching unit (BMU). The weights of the BMU and ...
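
    A compact sketch of the training loop described above, assuming a square grid, a Gaussian neighborhood, and a linear decay schedule (none of which are specified in the snippet): each example's Euclidean distance to every weight vector is computed, the closest neuron is taken as the BMU, and the BMU together with its grid neighbors is pulled toward the input.

        import numpy as np

        rng = np.random.default_rng(0)
        grid_h, grid_w, dim = 8, 8, 3
        weights = rng.random((grid_h, grid_w, dim))      # one weight vector per map neuron
        coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                      indexing="ij"), axis=-1)

        X = rng.random((500, dim))                       # toy unlabeled training examples

        def train_step(x, t, n_steps, lr0=0.5, sigma0=3.0):
            lr = lr0 * (1 - t / n_steps)                 # assumed linear decay schedules
            sigma = sigma0 * (1 - t / n_steps) + 0.5
            # Euclidean distance from the input to every weight vector.
            dists = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(dists.argmin(), dists.shape)  # best matching unit
            # Gaussian neighborhood on the grid: the BMU and nearby neurons move most.
            grid_d2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-grid_d2 / (2 * sigma ** 2))
            weights[...] += lr * h[..., None] * (x - weights)

        n_steps = 5 * len(X)                             # examples presented several times
        for t in range(n_steps):
            train_step(X[t % len(X)], t, n_steps)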

  5. Machine learning - Wikipedia

    en.wikipedia.org/wiki/Machine_learning

    Semi-supervised learning falls between unsupervised learning (without any labeled training data) and supervised learning (with completely labeled training data). Some of the training examples are missing training labels, yet many machine-learning researchers have found that unlabeled data, when used in conjunction with a small amount of labeled ...
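
    One common way to combine a small labeled set with unlabeled data is self-training (pseudo-labeling); the sketch below is a hedged illustration of that pattern, not the method of any particular paper. It fits a nearest-centroid classifier on the few labeled points, then repeatedly adopts confident predictions on unlabeled points as labels. The data, the confidence threshold, and the choice of classifier are assumptions made for the example.

        import numpy as np

        rng = np.random.default_rng(0)
        # Two made-up classes; only ten points keep their labels.
        X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
        y_true = np.array([0] * 100 + [1] * 100)
        labeled = np.concatenate([rng.choice(100, 5, replace=False),
                                  100 + rng.choice(100, 5, replace=False)])
        y = np.full(200, -1)                   # -1 marks a missing label
        y[labeled] = y_true[labeled]

        def fit_centroids(X, y):
            return np.array([X[y == c].mean(axis=0) for c in (0, 1)])

        def predict(X, centroids):
            d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            return d.argmin(axis=1), d.min(axis=1)

        # Self-training: fit on labeled points, adopt confident pseudo-labels, repeat.
        for _ in range(5):
            centroids = fit_centroids(X, y)
            pred, dist = predict(X, centroids)
            confident = (y == -1) & (dist < 1.5)   # confidence threshold is an assumption
            y[confident] = pred[confident]

        print((predict(X, fit_centroids(X, y))[0] == y_true).mean())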

  6. Feature learning - Wikipedia

    en.wikipedia.org/wiki/Feature_learning

    An example of unsupervised dictionary learning is sparse coding, which aims to learn basis functions (dictionary elements) for data representation from unlabeled input data. Sparse coding can be applied to learn overcomplete dictionaries, where the number of dictionary elements is larger than the dimension of the input data.[21]
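
    A hedged sketch of dictionary learning for sparse coding by alternating minimization, using only numpy: an ISTA-style soft-thresholding step for the sparse codes and a gradient step with column renormalization for the dictionary. The dictionary here is overcomplete (more atoms than input dimensions), as in the snippet; the penalty weight, step sizes, and random data are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        n, d, k = 500, 16, 32                  # k > d: overcomplete dictionary
        X = rng.normal(size=(n, d))            # made-up unlabeled input data
        D = rng.normal(size=(d, k))
        D /= np.linalg.norm(D, axis=0)         # unit-norm dictionary atoms (columns)

        lam, code_step, dict_step = 0.2, 0.05, 0.5

        def soft_threshold(z, t):
            return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

        codes = np.zeros((n, k))
        for _ in range(100):
            # Sparse coding step (ISTA): gradient on the reconstruction error, then
            # soft-thresholding, which pushes most coefficients to exactly zero.
            resid = codes @ D.T - X
            codes = soft_threshold(codes - code_step * (resid @ D), code_step * lam)
            # Dictionary step: move atoms to better reconstruct X, renormalize columns.
            resid = codes @ D.T - X
            D -= dict_step * (resid.T @ codes) / n
            D /= np.linalg.norm(D, axis=0) + 1e-12

        print(np.linalg.norm(codes @ D.T - X) / np.linalg.norm(X), (codes != 0).mean())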

  7. Deep belief network - Wikipedia

    en.wikipedia.org/wiki/Deep_belief_network

    The observation[2] that DBNs can be trained greedily, one layer at a time, led to one of the first effective deep learning algorithms.[4]: 6 Overall, there are many attractive implementations and uses of DBNs in real-life applications and scenarios (e.g., electroencephalography,[5] drug discovery[6][7][8]).
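
    The greedy, layer-at-a-time idea can be sketched with a small stack of binary RBMs trained by one-step contrastive divergence (CD-1), each layer trained on the hidden activations of the layer below. This is an illustrative reconstruction under stated assumptions, not the original algorithm's exact recipe; the unit types, CD-1, layer sizes, and data are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        def train_rbm(data, n_hidden, epochs=5, lr=0.05, batch=50):
            """Train one binary RBM with CD-1; return (W, b_vis, b_hid)."""
            n_vis = data.shape[1]
            W = 0.01 * rng.normal(size=(n_vis, n_hidden))
            b_vis, b_hid = np.zeros(n_vis), np.zeros(n_hidden)
            for _ in range(epochs):
                for i in range(0, len(data), batch):
                    v0 = data[i:i + batch]
                    # Positive phase: hidden probabilities given the data.
                    h0 = sigmoid(v0 @ W + b_hid)
                    # One Gibbs step (CD-1): sample hiddens, reconstruct, re-infer.
                    h_sample = (rng.random(h0.shape) < h0).astype(float)
                    v1 = sigmoid(h_sample @ W.T + b_vis)
                    h1 = sigmoid(v1 @ W + b_hid)
                    # Contrastive-divergence updates.
                    W += lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
                    b_vis += lr * (v0 - v1).mean(axis=0)
                    b_hid += lr * (h0 - h1).mean(axis=0)
            return W, b_vis, b_hid

        # Greedy stacking: each RBM is trained on the layer below's hidden activations.
        X = (rng.random((1000, 64)) < 0.3).astype(float)   # made-up binary data
        inputs, stack = X, []
        for n_hidden in [32, 16]:
            W, b_vis, b_hid = train_rbm(inputs, n_hidden)
            stack.append((W, b_vis, b_hid))
            inputs = sigmoid(inputs @ W + b_hid)           # feed the next layer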

  8. List of datasets for machine-learning research - Wikipedia

    en.wikipedia.org/wiki/List_of_datasets_for...

    High-quality labeled training datasets for supervised and semi-supervised machine learning algorithms are usually difficult and expensive to produce because of the large amount of time needed to label the data. Although they do not need to be labeled, high-quality datasets for unsupervised learning can also be difficult and costly to produce ...