Search: "Synthetic likelihood"

Showing results 1 - 5 of 10 theses containing the words Synthetic likelihood.

  1. Exploring Normalizing Flow Modifications for Improved Model Expressivity

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Marcel Juschak [2023]
    Keywords: Normalizing Flows; Motion Synthesis; Invertible Neural Networks; Glow; MoGlow; Maximum Likelihood Estimation; Generative models

    Abstract: Normalizing flows represent a class of generative models that exhibit a number of attractive properties, but do not always achieve state-of-the-art performance when it comes to perceived naturalness of generated samples. To improve the quality of generated samples, this thesis examines methods to enhance the expressivity of discrete-time normalizing flow models and thus their ability to capture different aspects of the data. …
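
    The keywords point to Glow-style flows; for context, here is a minimal sketch of the affine coupling layer such discrete-time flows stack, assuming a toy 4-dimensional input and stand-in linear "networks" W_s and W_t. All names and values are illustrative, not the thesis's code.

    ```python
    # Minimal affine-coupling sketch (Real NVP / Glow style); the scale
    # and shift "networks" are toy linear maps for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    W_s = rng.normal(size=(2, 2))  # stand-in scale network
    W_t = rng.normal(size=(2, 2))  # stand-in shift network

    def coupling_forward(x):
        """Transform x -> z and return log|det J| for the likelihood."""
        x1, x2 = x[:2], x[2:]                 # split the features in half
        s, t = np.tanh(W_s @ x1), W_t @ x1    # scale/shift depend on x1 only
        z2 = x2 * np.exp(s) + t               # affine transform of x2
        log_det = s.sum()                     # Jacobian is diagonal: sum of s
        return np.concatenate([x1, z2]), log_det

    def coupling_inverse(z):
        """Exact inverse, which is what makes the flow invertible."""
        z1, z2 = z[:2], z[2:]
        s, t = np.tanh(W_s @ z1), W_t @ z1
        x2 = (z2 - t) * np.exp(-s)
        return np.concatenate([z1, x2])

    x = rng.normal(size=4)
    z, log_det = coupling_forward(x)
    assert np.allclose(coupling_inverse(z), x)  # invertibility check
    # Log-likelihood under a standard-normal base distribution
    # via the change-of-variables formula:
    log_p = -0.5 * (z @ z) - 2 * np.log(2 * np.pi) + log_det
    ```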

  2. Application of Bootstrap in Approximate Bayesian Computation (ABC)

    Master's thesis, Uppsala University/Statistics, AI and Data Science

    Author: Ellinor Nyman [2023]
    Keywords: ABC; Bootstrap; Computer-intensive methods; Bayesian statistics; Linear regression

    Abstract: The ABC algorithm is a Bayesian method that simulates samples from the posterior distribution. In this thesis, the method is applied to both synthetic and observed data from a regression model. Under a normal error distribution, a conjugate prior and the likelihood function are used in the algorithm. …
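
    As a point of reference, here is a minimal rejection-ABC loop in Python, assuming a toy normal model with known standard deviation. The prior, the tolerance eps, and the mean-based summary statistic are illustrative choices, not the thesis's setup.

    ```python
    # Minimal ABC rejection sketch: infer the mean mu of a normal with
    # known sd. Data, prior, and tolerance are toy stand-ins.
    import numpy as np

    rng = np.random.default_rng(1)
    observed = rng.normal(loc=2.0, scale=1.0, size=50)  # "observed" data
    s_obs = observed.mean()                             # summary statistic

    def abc_rejection(n_draws=100_000, eps=0.05):
        accepted = []
        for _ in range(n_draws):
            mu = rng.normal(0.0, 10.0)                    # draw from the prior
            sim = rng.normal(mu, 1.0, size=observed.size) # simulate data
            if abs(sim.mean() - s_obs) < eps:             # compare summaries
                accepted.append(mu)                       # keep mu if close
        return np.array(accepted)

    posterior = abc_rejection()
    print(posterior.mean(), posterior.std())  # approximate posterior moments
    ```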

  3. Investigating Relations between Regularization and Weight Initialization in Artificial Neural Networks

    Bachelor's thesis, Lund University/Computational Biology and Biological Physics

    Author: Rasmus Sjöö [2022]
    Keywords: Artificial Neural Networks; L1 Regularization; L2 Regularization; Loss Function; Maximum Likelihood; Regularization Strength; Synthetic Data Generation; Weight Initialization; Physics and Astronomy

    Abstract: L2 regularization is a common method used to prevent overtraining in artificial neural networks. However, an issue with this method is that the regularization strength has to be properly adjusted for it to work as intended. This value is usually found by trial and error, which can be time-consuming, especially for larger networks. …
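
    To make the role of the regularization strength concrete, here is a minimal sketch of an L2-penalized squared-error loss with gradient descent, assuming a linear model on synthetic data. The strength lam is the value the abstract describes tuning by trial and error; all names and numbers are illustrative.

    ```python
    # Minimal L2-regularization sketch on a linear least-squares model.
    import numpy as np

    rng = np.random.default_rng(2)
    X, y = rng.normal(size=(100, 5)), rng.normal(size=100)
    w = np.zeros(5)
    lam, lr = 0.1, 0.01   # regularization strength, learning rate

    for _ in range(500):
        resid = X @ w - y
        # Loss = mean squared error + L2 penalty on the weights:
        loss = 0.5 * (resid @ resid) / len(y) + 0.5 * lam * (w @ w)
        grad = X.T @ resid / len(y) + lam * w  # L2 adds lam*w to the gradient
        w -= lr * grad                         # plain gradient descent step
    ```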

  4. Spatiotemporal PET Reconstruction with Learned Registration

    Bachelor's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Pierre Meyrat [2022]
    Keywords: PET; Tomographic reconstruction; Deep Learning; MLEM

    Abstract: Because of the long acquisition time of Positron Emission Tomography scanners, the reconstructed images are blurred by motion. We propose a novel motion-correction maximum-likelihood expectation-maximization (MLEM) algorithm that integrates 3D movements between the different gates, estimated by a neural network trained on synthetic data with contrast invariance. …
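
    For reference, here is a minimal sketch of the plain MLEM update that such a motion-corrected algorithm extends, assuming a toy system matrix and Poisson counts rather than a real PET geometry; the registration network from the abstract is not modeled here.

    ```python
    # Minimal MLEM iteration sketch with a toy forward model A and
    # Poisson-distributed measurements y; dimensions are illustrative.
    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.uniform(0.0, 1.0, size=(30, 10))  # toy system matrix
    x_true = rng.uniform(1.0, 5.0, size=10)
    y = rng.poisson(A @ x_true)               # Poisson measurements

    x = np.ones(10)                           # uniform initial image
    sens = A.sum(axis=0)                      # sensitivity image: A^T 1
    for _ in range(100):
        ratio = y / np.maximum(A @ x, 1e-12)  # measured / predicted counts
        x *= (A.T @ ratio) / sens             # multiplicative MLEM update
    ```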

  5. Analyzing the Negative Log-Likelihood Loss in Generative Modeling

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Aleix Espuña I Fontcuberta [2022]
    Keywords: Generative modeling; Normalizing flows; Generative Adversarial Networks; Maximum Likelihood Estimation; Real Non-Volume Preserving flow; Fréchet Inception Distance; Misspecification

    Abstract: Maximum-Likelihood Estimation (MLE) is a classic model-fitting method from probability theory. However, it has been argued repeatedly that MLE is inappropriate for synthesis applications, since its priorities are at odds with important principles of human perception, and that, e.g. …
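
    As a concrete reminder of the objective under discussion, here is a minimal sketch of the average negative log-likelihood for a univariate Gaussian, whose minimizer is the closed-form MLE; the model choice and data are illustrative, not the thesis's experiments.

    ```python
    # Minimal NLL sketch: MLE minimizes the average negative
    # log-likelihood of the data under the model.
    import numpy as np

    rng = np.random.default_rng(4)
    data = rng.normal(3.0, 2.0, size=1000)

    def avg_nll(mu, sigma, x):
        """Average negative log-likelihood of x under N(mu, sigma^2)."""
        return (0.5 * np.log(2 * np.pi * sigma**2)
                + ((x - mu) ** 2).mean() / (2 * sigma**2))

    # The closed-form MLE (sample mean, population std) minimizes avg_nll:
    mu_hat, sigma_hat = data.mean(), data.std()
    print(avg_nll(mu_hat, sigma_hat, data))
    ```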