Search: "Generative Pre-training"

Showing results 1 - 5 of 6 theses containing the words Generative Pre-training.

  1. Text-Driven Fashion Image Manipulation with GANs: A case study in full-body human image manipulation in fashion

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Reza Dadfar; [2023]
    Keywords: Multimodal fashion image editing; Generative adversarial network inversion; Text-driven image manipulation; TD-GEM;

    Abstract: Language-based fashion image editing has promising applications in design, sustainability, and art. However, it is considered a challenging problem in computer vision and graphics. The diversity of human poses and the complexity of clothing shapes and textures make the editing problem difficult.

  2. Fine-tuning Bot Play Styles From Demonstration

    Master's thesis, Uppsala University/Department of Information Technology

    Author: Felicia Fredriksson; [2023]
    Keywords:

    Abstract: In recent years, Reinforcement Learning (RL) has successfully been used to train agents for games. Nonetheless, in the game industry there is still a need for bots not only to succeed in their environments but also to act human-like while playing the game.

  3. Enhanced Experience Generation for Reinforcement Learning Pre-training in Telecommunication Systems

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Tianxiao Zhao; [2020]
    Keywords:

    Abstract: In recent years, the rise of Reinforcement Learning (RL) in robotics and games has attracted growing attention from other industries, such as telecommunications. One novel attempt to apply reinforcement learning within telecommunications is to train an RL agent for auto-scaling Virtualized Network Functions (VNFs) in a core network environment.

  4. Response Generation Using Large-scale Pre-trained Language Models

    Thesis for a professional degree at advanced level,

    Author: Jakob Nyberg; [2020]
    Keywords: Language models; Machine learning;

    Abstract: In this project I studied how generative neural language models can be used for response generation. The purpose of the model is to generate responses for a social robot, instead of having responses authored and evaluated by crowd-sourced workers.
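    Illustrative sketch (not taken from this thesis): the entry above describes generating responses for a social robot with a generative language model. The snippet below shows what minimal response generation with a pretrained GPT-2 checkpoint from Hugging Face could look like; the model choice, prompt format, and decoding settings are assumptions for illustration, not the author's setup.

        from transformers import GPT2LMHeadModel, GPT2TokenizerFast

        tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
        model = GPT2LMHeadModel.from_pretrained("gpt2")

        # Dialogue history concatenated into a single prompt (illustrative format only).
        prompt = "User: What did you do today?\nRobot:"
        inputs = tokenizer(prompt, return_tensors="pt")

        # Nucleus sampling gives varied, non-greedy responses.
        output = model.generate(
            **inputs,
            max_new_tokens=40,
            do_sample=True,
            top_p=0.9,
            pad_token_id=tokenizer.eos_token_id,
        )
        # Decode only the newly generated tokens, not the prompt.
        response = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
        print(response)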

  5. Employing a Transformer Language Model for Information Retrieval and Document Classification: Using OpenAI's generative pre-trained transformer, GPT-2

    Master's thesis, KTH/School of Electrical Engineering and Computer Science (EECS)

    Author: Anton Bjöörn; [2020]
    Keywords: Deep Learning; Transformer Models; Information Retrieval; Ranking; Generative Pre-training; Document Classification;

    Abstract: As the information flow on the Internet keeps growing, it becomes increasingly easy to miss important news that does not have mass appeal. Combating this problem calls for increasingly sophisticated information retrieval methods.
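    Illustrative sketch (not the method of this thesis): the title above mentions using a generative pre-trained transformer, GPT-2, for information retrieval and ranking. One simple way a generative language model can rank documents is query-likelihood scoring, sketched below with the Hugging Face GPT-2 checkpoint; the scoring scheme, model, and example data are assumptions for illustration only.

        import torch
        from transformers import GPT2LMHeadModel, GPT2TokenizerFast

        tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
        model = GPT2LMHeadModel.from_pretrained("gpt2")
        model.eval()

        def query_log_likelihood(document: str, query: str) -> float:
            """Average log-likelihood of the query tokens, conditioned on the document."""
            doc_ids = tokenizer(document, return_tensors="pt")["input_ids"]
            query_ids = tokenizer(" " + query, return_tensors="pt")["input_ids"]
            input_ids = torch.cat([doc_ids, query_ids], dim=1)
            labels = input_ids.clone()
            labels[:, : doc_ids.shape[1]] = -100  # ignore document tokens in the loss
            with torch.no_grad():
                loss = model(input_ids, labels=labels).loss  # mean NLL over query tokens
            return -loss.item()

        docs = [
            "The central bank raised interest rates to curb inflation.",
            "The local football team won the championship on Saturday.",
        ]
        query = "economic policy news"
        ranked = sorted(docs, key=lambda d: query_log_likelihood(d, query), reverse=True)
        print(ranked[0])  # the document the model finds most predictive of the query

    Documents are ordered by how well they predict the query under the language model, which is one basic generative approach to ranking; a real system would also need indexing and candidate retrieval before such rescoring.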