Search: "SGD"
Showing results 1 - 5 of 26 theses containing the word SGD.
1. Variational AutoEncoders and Differential Privacy : balancing data synthesis and privacy constraints
Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS). Abstract: This thesis investigates the effectiveness of Tabular Variational Autoencoders (TVAEs) in generating high-quality synthetic tabular data and assesses their compliance with differential privacy principles. The study shows that although TVAEs outperform VAEs at generating synthetic data that faithfully reproduces the distribution of real data, as measured by the Synthetic Data Vault (SDV) metrics, high scores on these metrics do not guarantee that the synthetic data is adequate for practical industrial applications.
2. Exploring State-of-the-Art Machine Learning Methods for Quantifying Exercise-induced Muscle Fatigue
Thesis for professional degree, advanced level, Halmstad University/School of Information Technology. Abstract: Muscle fatigue is a severe problem for elite athletes because the required rest times are long and can vary. Muscle fatigue can be caused by various mechanisms and signifies that the specific muscle has reached its maximum force and cannot continue the task.
3. Implementation of Skogliga Grunddata in hprGallring for second thinning
Bachelor's thesis, SLU/School for Forest Management. Abstract: Thinning is one of the silvicultural measures carried out in our forests to meet the various forest-management objectives that owners have for their forests. In Sweden, thinning is performed on approximately 400,000 hectares annually.
4. Stochastic Frank-Wolfe Algorithm : Uniform Sampling Without Replacement
Master's thesis, Umeå University/Department of Mathematics and Mathematical Statistics. Abstract: The Frank-Wolfe (FW) optimization algorithm, owing to its projection-free property, has gained popularity in recent years, with typical applications in machine learning. In the stochastic setting, it remains relatively understudied in comparison with the more expensive projected variant of Stochastic Gradient Descent (SGD).
5. On the Modelling of Stochastic Gradient Descent with Stochastic Differential Equations
Master's thesis, Uppsala University/Department of Analysis and Partial Differential Equations. Abstract: Stochastic gradient descent (SGD) is arguably the most important algorithm used in optimization problems for large-scale machine learning. Its behaviour has been studied extensively from the viewpoint of mathematical analysis and probability theory; it is widely held that in the limit where the learning rate of the algorithm tends to zero, a specific stochastic differential equation becomes an adequate model of the algorithm's dynamics.
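The SGD dynamics this abstract refers to can be sketched in a few lines: the update x_{k+1} = x_k - eta * g_k with a noisy gradient g_k behaves, for small learning rate eta, like a diffusion around the minimizer. The one-dimensional quadratic objective, noise model, and parameter values below are illustrative assumptions, not the thesis's setup.

```python
import numpy as np

# Sketch: SGD on f(x) = x^2 / 2 with additive gradient noise. For small
# eta, the iterates resemble a discretized SDE and fluctuate around the
# minimum at a scale governed by eta and the noise variance.
rng = np.random.default_rng(1)
eta = 0.05                             # learning rate
x = 5.0                                # initial iterate
for _ in range(2000):
    grad = x + rng.normal(scale=0.5)   # noisy gradient: true gradient is x
    x -= eta * grad                    # SGD update x_{k+1} = x_k - eta * g_k

print(x)                               # small residual jitter near the minimum at 0
```

Shrinking eta tightens the stationary fluctuations, which is the discrete counterpart of the small-learning-rate SDE limit discussed in the abstract.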