Search: "Normalizing Flows"

Showing results 1 - 5 of 9 theses containing the words Normalizing Flows.

  1. Exploring Normalizing Flow Modifications for Improved Model Expressivity

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Marcel Juschak; [2023]
    Keywords: Normalizing Flows; Motion Synthesis; Invertible Neural Networks; Glow; MoGlow; Maximum Likelihood Estimation; Generative Models;

    Abstract: Normalizing flows represent a class of generative models with a number of attractive properties, but they do not always achieve state-of-the-art performance in terms of the perceived naturalness of generated samples. To improve sample quality, this thesis examines methods for enhancing the expressivity of discrete-time normalizing flow models, and thus their ability to capture different aspects of the data. READ MORE
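    As general background (standard material, not specific to this thesis): a discrete-time normalizing flow maps data x through an invertible transformation f onto a simple base density p_Z and is fitted by maximum likelihood via the change-of-variables formula

        \log p_X(x) = \log p_Z\big(f(x)\big) + \log \left| \det \frac{\partial f(x)}{\partial x} \right|,

    so the expressivity of such a model comes down to how flexible f can be made while keeping its inverse and Jacobian log-determinant tractable.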

  2. Probabilistic Forecasting through Reformer Conditioned Normalizing Flows

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Samuel Norling; [2022]
    Keywords: (none listed)

    Abstract: Forecasts are essential for human decision-making in several fields, such as weather forecasting, retail pricing, or stock prediction. Recently, the Transformer neural network, commonly used for sequence-to-sequence tasks, has shown great potential in achieving state-of-the-art forecasting results when combined with density estimation models such as autoregressive flows. READ MORE
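    For context (standard background rather than a description of this particular thesis): an autoregressive flow factorizes the joint density of a sequence into conditionals,

        p_\theta(x_{1:T}) = \prod_{t=1}^{T} p_\theta(x_t \mid x_{<t}),

    with each conditional given by an invertible transform of a base density whose parameters depend on the observed history; conditioning those parameters on the hidden states of a Transformer-family encoder such as the Reformer is what allows the two model classes to be combined for probabilistic forecasting.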

  3. Comparison of Discriminative and Generative Image Classifiers

    Bachelor's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Authors: Simon Budh; William Grip; [2022]
    Keywords: Image classification; CNN; Normalizing flows; RealNVP; Adversarial examples;

    Abstract: In this report, a discriminative and a generative image classifier, used to classify images of handwritten digits from zero to nine, are compared. The aim of this project was to compare the accuracy of the two classifiers in the absence and presence of perturbations to the images. READ MORE

  4. Analyzing the Negative Log-Likelihood Loss in Generative Modeling

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Aleix Espuña I Fontcuberta; [2022]
    Keywords: Generative modeling; Normalizing flows; Generative Adversarial Networks; Maximum Likelihood Estimation; Real Non-Volume Preserving flow; Fréchet Inception Distance; Misspecification;

    Abstract: Maximum-Likelihood Estimation (MLE) is a classic model-fitting method from probability theory. However, it has been argued repeatedly that MLE is inappropriate for synthesis applications, since its priorities are at odds with important principles of human perception, and that, e.g. READ MORE
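    For reference, MLE and the negative log-likelihood (NLL) loss named in the title are two views of the same objective: the fitted parameters maximize the log-likelihood of the training data, or equivalently minimize the NLL,

        \hat{\theta} = \arg\max_{\theta} \sum_{i=1}^{N} \log p_{\theta}(x_i) = \arg\min_{\theta} \left( - \sum_{i=1}^{N} \log p_{\theta}(x_i) \right).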

  5. The Impact of Noise on Generative and Discriminative Image Classifiers

    Bachelor's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Authors: Maximilian Stenlund; Valdemar Jakobsson; [2022]
    Keywords: Artificial intelligence; Adversarial noise; Discriminative; Generative; Salt and pepper noise; Gaussian noise; Neural networks; Normalizing flows; Convolutional networks;

    Abstract: This report analyzes the difference between discriminative and generative image classifiers when they are evaluated on noisy images. The generative classifier was a maximum-likelihood-based classifier using a normalizing flow as the generative model; in this work, a RealNVP-style coupling flow was used. READ MORE
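    To make the model concrete, the snippet below is a minimal NumPy sketch of a RealNVP-style affine coupling step; the function and the stand-in scale/shift networks are hypothetical illustrations, not code from the thesis. Half of the features pass through unchanged and parameterize an affine transform of the other half, so the step is invertible and its Jacobian log-determinant is a simple sum of log-scales.

        import numpy as np

        def affine_coupling_forward(x, scale_net, shift_net):
            # RealNVP-style affine coupling (illustrative sketch, not thesis code).
            # The first half of the features is left unchanged and conditions an
            # affine transform of the second half, giving a triangular Jacobian.
            d = x.shape[-1] // 2
            x1, x2 = x[..., :d], x[..., d:]
            s = scale_net(x1)                      # per-dimension log-scales
            t = shift_net(x1)                      # per-dimension shifts
            y2 = x2 * np.exp(s) + t
            y = np.concatenate([x1, y2], axis=-1)
            log_det_jacobian = s.sum(axis=-1)      # log|det J| = sum of log-scales
            return y, log_det_jacobian

        # Toy usage with stand-in "networks" (hypothetical, for illustration only).
        rng = np.random.default_rng(0)
        W_s, W_t = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))
        scale_net = lambda h: np.tanh(h @ W_s)     # bounded log-scales for stability
        shift_net = lambda h: h @ W_t
        x = rng.normal(size=(5, 4))
        y, log_det = affine_coupling_forward(x, scale_net, shift_net)

    In a maximum-likelihood-based generative classifier of the kind described above, one would typically fit one such flow per digit class and assign a test image to the class under which it attains the highest log-likelihood.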