Sökning: "Convex optimization"

Showing results 1 - 5 of 83 theses containing the words Convex optimization.

  1. On Linear Mode Connectivity up to Permutation of Hidden Neurons in Neural Network : When does Weight Averaging work?

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Adhithyan Kalaivanan; [2023]
    Keywords: Mode Connectivity; Representation Learning; Loss Landscape; Network Symmetry;

    Abstract: Neural networks trained using gradient-based optimization methods exhibit a surprising phenomenon known as mode connectivity, where two independently trained network weights are not isolated low-loss minima in the parameter space. Instead, they can be connected by simple curves along which the loss remains low.
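
    To make the phenomenon concrete, here is a minimal, hypothetical sketch of how linear mode connectivity can be probed: evaluate the loss along the straight segment between two independently trained weight vectors. The loss function and the flattened weight vectors are placeholder assumptions for illustration, not the thesis's experimental setup (which additionally permutes hidden neurons before interpolating).

      import numpy as np

      def loss_along_segment(loss_fn, weights_a, weights_b, num_points=11):
          """Evaluate loss_fn at evenly spaced points on the segment
          (1 - t) * weights_a + t * weights_b, t in [0, 1]."""
          ts = np.linspace(0.0, 1.0, num_points)
          losses = [loss_fn((1.0 - t) * weights_a + t * weights_b) for t in ts]
          return ts, np.array(losses)

      # Toy usage with a convex quadratic, which has no barrier between any two
      # points; a neural network loss typically does, unless the two networks
      # are aligned first (for example by permuting hidden neurons).
      quadratic = lambda w: float(np.sum(w ** 2))
      w_a, w_b = np.array([1.0, -2.0]), np.array([-1.5, 0.5])
      for t, loss in zip(*loss_along_segment(quadratic, w_a, w_b)):
          print(f"t = {t:.1f}  loss = {loss:.3f}")

    A pronounced rise in the loss at intermediate t indicates a barrier between the two minima; the thesis asks when such barriers vanish, so that averaging the weights actually works.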

  2. Optimization of Radiotherapy Treatment Plans Based on Monte Carlo Dose Computations

    Professional degree thesis (advanced level), Lund University/Department of Automatic Control

    Author: Ludvig Håkansson; [2023]
    Keywords: Technology and Engineering;

    Abstract: Treatment planning plays a vital role in providing good treatment to cancer patients. In order to reach an adequate treatment plan, the algorithm used for simulating the dose in the patient must model reality well.

  3. Stochastic Frank-Wolfe Algorithm : Uniform Sampling Without Replacement

    Master's thesis, Umeå University/Department of Mathematics and Mathematical Statistics

    Author: Olof Håkman; [2023]
    Keywords: Stochastic Frank-Wolfe; Stochastic optimization; Sampling without replacement;

    Abstract: The Frank-Wolfe (FW) optimization algorithm, due to its projection-free property, has gained popularity in recent years, with typical applications in the field of machine learning. In the stochastic setting, it is still relatively understudied in comparison to the more expensive projected method of Stochastic Gradient Descent (SGD).
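
    To make the projection-free idea concrete, below is a minimal sketch of stochastic Frank-Wolfe on a least-squares problem over the probability simplex, with mini-batches drawn without replacement by reshuffling the data each epoch. The problem data, batch size, and the classic 2/(t + 2) step size are illustrative assumptions, not the setting studied in the thesis.

      import numpy as np

      rng = np.random.default_rng(0)
      n, d = 500, 20
      A = rng.normal(size=(n, d))
      x_true = rng.dirichlet(np.ones(d))            # a point on the simplex
      b = A @ x_true + 0.01 * rng.normal(size=n)

      def minibatch_gradient(x, idx):
          """Gradient of (1 / 2n) * ||Ax - b||^2 estimated from the rows in idx."""
          Ai, bi = A[idx], b[idx]
          return Ai.T @ (Ai @ x - bi) / len(idx)

      def lmo_simplex(grad):
          """Linear minimization oracle for the probability simplex: the vertex
          (unit coordinate vector) with the smallest gradient entry."""
          s = np.zeros_like(grad)
          s[np.argmin(grad)] = 1.0
          return s

      x = np.ones(d) / d                            # start at the simplex centre
      batch = 50
      num_batches = n // batch
      for epoch in range(20):
          perm = rng.permutation(n)                 # sampling without replacement
          for k in range(num_batches):
              idx = perm[k * batch:(k + 1) * batch]
              s = lmo_simplex(minibatch_gradient(x, idx))
              gamma = 2.0 / (epoch * num_batches + k + 2)
              x = x + gamma * (s - x)               # convex step, no projection needed

      print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))

    Every iterate stays a convex combination of simplex vertices, so feasibility is maintained without any projection step, which is the property that makes FW attractive compared to projected SGD.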

  4. Optimizing First-Order Method Parameters via Differentiation of the Performance Estimation Problem

    Professional degree thesis (advanced level), Lund University/Department of Automatic Control

    Author: Anton Åkerman; [2023]
    Keywords: Technology and Engineering;

    Abstract: This thesis treats the problem of finding optimal parameters for first-order optimization methods. In part, we use the Performance Estimation Problem (PEP), a framework for convergence analysis of first-order optimization methods.
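
    For context, a generic instance of the Performance Estimation Problem is sketched below; it poses the worst-case performance of N steps of gradient descent with step sizes h_k over L-smooth convex functions as an optimization problem. The method family, function class, and performance measure are assumptions chosen for illustration and need not match those treated in the thesis.

      \begin{align*}
      \max_{f,\; x_0, \dots, x_N,\; x_\star} \quad
        & f(x_N) - f(x_\star) \\
      \text{subject to} \quad
        & f \ \text{convex and } L\text{-smooth}, \\
        & x_{k+1} = x_k - h_k \nabla f(x_k), \qquad k = 0, \dots, N - 1, \\
        & \nabla f(x_\star) = 0, \qquad \lVert x_0 - x_\star \rVert \le R.
      \end{align*}

    Using interpolation conditions for the function class, this infinite-dimensional problem can be rewritten as a finite-dimensional semidefinite program; differentiating its optimal value with respect to the step sizes h_k is then one natural route to tuning the method's parameters, in the spirit of the title above.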

  5. Adjoint optimization of a liquid-cooled heat sink

    Master's thesis, KTH/Engineering Mechanics

    Author: Roven Pinto; [2023]
    Keywords: Adjoint method; topology optimization; porosity; OpenFOAM;

    Abstract: Improving the design of the flow channels in a liquid-cooled heat sink is critical for boosting the capabilities of electronic components as well as for reducing the energy used by the pump. This work explores the use of topology optimization to minimize the pressure difference across a heat sink and, consequently, the energy used to supply the liquid.
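
    As a generic illustration of the adjoint method listed in the keywords, the equations below give the discrete adjoint sensitivity of an objective J(u, γ) (for instance the pressure difference across the heat sink) with respect to a design field γ (here a cell-wise porosity), where the state u satisfies a discretized flow residual R(u, γ) = 0. The notation is an assumption for illustration; the specific formulation used with OpenFOAM in the thesis may differ.

      \begin{align*}
      \Big(\frac{\partial R}{\partial u}\Big)^{\!\top} \lambda
        &= -\Big(\frac{\partial J}{\partial u}\Big)^{\!\top}
        && \text{(one adjoint solve),} \\
      \frac{\mathrm{d}J}{\mathrm{d}\gamma}
        &= \frac{\partial J}{\partial \gamma}
          + \lambda^{\top} \frac{\partial R}{\partial \gamma}
        && \text{(sensitivity with respect to the porosity field).}
      \end{align*}

    The appeal of the adjoint approach is that the gradient costs roughly one extra solve regardless of how many cells carry a design variable, which is what makes cell-wise topology optimization of the channel layout tractable.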