KPNet: Towards a parameterized implicit 2d cloth rendering

Abstract

Simulating clothing for a virtual try-on is still a challenging task, especially if the customer wants to use state-of-the-art technology. To address this, we work in a 2D image plane to process customer images. Specifically, we use a neural network, namely an autoencoder, to render so-called fashion landmarks. As input we use human keypoints that represent the model's pose together with the fashion landmarks of the clothing taken from stock photos, and we generate fashion landmarks in the desired pose. These can then be used by additional algorithms to adapt the length or width of the clothing.
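As a rough illustration of the setup described in the abstract, the sketch below shows a minimal keypoint-conditioned autoencoder in PyTorch that takes human pose keypoints and garment fashion landmarks as input and predicts fashion landmarks in the target pose. The class name, the numbers of keypoints and landmarks, and the layer sizes are assumptions made for illustration only; they are not taken from the publication.

```python
# Minimal sketch (assumed dimensions and layer sizes, not from the paper):
# an autoencoder that maps human pose keypoints plus garment fashion
# landmarks (from a stock photo) to fashion landmarks in the target pose.
import torch
import torch.nn as nn

NUM_POSE_KEYPOINTS = 17      # assumed number of human body keypoints (x, y)
NUM_FASHION_LANDMARKS = 8    # assumed number of fashion landmarks (x, y)
LATENT_DIM = 64              # assumed latent size


class KPNetSketch(nn.Module):
    def __init__(self):
        super().__init__()
        in_dim = 2 * (NUM_POSE_KEYPOINTS + NUM_FASHION_LANDMARKS)
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, LATENT_DIM), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(LATENT_DIM, 128), nn.ReLU(),
            nn.Linear(128, 2 * NUM_FASHION_LANDMARKS),
        )

    def forward(self, pose_keypoints, garment_landmarks):
        # pose_keypoints:    (batch, NUM_POSE_KEYPOINTS, 2)
        # garment_landmarks: (batch, NUM_FASHION_LANDMARKS, 2)
        x = torch.cat(
            [pose_keypoints.flatten(1), garment_landmarks.flatten(1)], dim=1
        )
        z = self.encoder(x)
        # Predicted fashion landmarks in the target pose:
        # (batch, NUM_FASHION_LANDMARKS, 2)
        return self.decoder(z).view(-1, NUM_FASHION_LANDMARKS, 2)


if __name__ == "__main__":
    model = KPNetSketch()
    pose = torch.rand(4, NUM_POSE_KEYPOINTS, 2)        # dummy pose keypoints
    garment = torch.rand(4, NUM_FASHION_LANDMARKS, 2)  # dummy garment landmarks
    print(model(pose, garment).shape)                  # torch.Size([4, 8, 2])
```

The predicted landmarks could then be passed to downstream algorithms that adjust the clothing length or width, as mentioned in the abstract.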

More about the title

Title KPNet: Towards a parameterized implicit 2d cloth rendering
Author(s)/Editor(s) Bastian Scharnagl, Prof. Dr. Christian Groth
Publication date 12.09.2024
Citation Scharnagl, Bastian; Groth, Christian (2024): KPNet: Towards a parameterized implicit 2d cloth rendering.