Journal of Textile Research ›› 2023, Vol. 44 ›› Issue (09): 84-90. doi: 10.13475/j.fzxb.20220504801

• Textile Engineering •

Embroidery style transfer modeling based on multi-scale texture synthesis

YAO Linhan1, ZHANG Ying1, YAO Lan1, ZHENG Xiaoping2, WEI Wenda3, LIU Chengxia1,4,5

  1. School of Fashion Design & Engineering, Zhejiang Sci-Tech University, Hangzhou, Zhejiang 310018, China
    2. China Textile Engineering Society, Beijing 100025, China
    3. China Textile Planning Institute of Construction, Beijing 100125, China
    4. Key Laboratory of Silk Culture Heritage and Product Design Digital Technology, Hangzhou, Zhejiang 310018, China
    5. Zhejiang Province Engineering Laboratory of Clothing Digital Technology, Hangzhou, Zhejiang 310018, China
  • Received: 2022-05-16  Revised: 2022-12-21  Online: 2023-09-15  Published: 2023-10-30

Abstract:

Objective Existing embroidery image generation algorithms still suffer from problems such as monotonous style in the generated images, coarse features, and numerous artifacts. To address these issues, an embroidery style transfer model based on multi-scale texture synthesis is proposed. By improving the existing style transfer algorithm, it is expected to produce embroidery-style images of higher perceptual quality.

Method A multi-stylized loss function was adopted to extract the edge information and detailed structure of the content image and the style image, and the resulting transfer images were compared against those generated using only the content loss and the style loss.
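
To make the method concrete, below is a minimal PyTorch sketch of such a combined objective: a content term, a Gram-matrix style term, and a Laplacian edge term, weighted with the values later listed in Tab. 1. The VGG-19 backbone, the layer indices, and all helper names are illustrative assumptions, not the paper's released code.

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg19

# Pretrained VGG-19 as the fixed feature extractor (a standard choice
# in neural style transfer; the paper's exact backbone is an assumption).
VGG = vgg19(weights="IMAGENET1K_V1").features.eval()
STYLE_LAYERS = {1, 6, 11, 20, 29}   # relu1_1 ... relu5_1 (assumed)
CONTENT_LAYER = 22                  # relu4_2 (assumed)

def extract(img):
    """Collect activations at the chosen layers for one image batch."""
    feats, x = {}, img
    for i, layer in enumerate(VGG):
        x = layer(x)
        if i in STYLE_LAYERS or i == CONTENT_LAYER:
            feats[i] = x
        if i >= 29:
            break
    return feats

def gram(f):
    # Channel-by-channel correlation of a (B, C, H, W) feature map.
    b, c, h, w = f.shape
    f = f.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

# 3x3 Laplacian kernel: responds to edges, flat regions give zero.
LAP = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]]).view(1, 1, 3, 3)

def laplacian(img):
    gray = img.mean(dim=1, keepdim=True)   # (B, 1, H, W) grayscale
    return F.conv2d(gray, LAP, padding=1)

def transfer_loss(gen, content, style, alpha=0.01, beta=1.0, gamma=0.001):
    gf, cf, sf = extract(gen), extract(content), extract(style)
    l_content = F.mse_loss(gf[CONTENT_LAYER], cf[CONTENT_LAYER])
    l_style = sum(F.mse_loss(gram(gf[i]), gram(sf[i])) for i in STYLE_LAYERS)
    l_edge = F.mse_loss(laplacian(gen), laplacian(content))
    return alpha * l_content + beta * l_style + gamma * l_edge
```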

Results Experimental results showed that the multi-stylized loss function could generate embroidery transfer images with clear texture. Gatys' model produced more artifacts and deformations when converting images with dense lines, Johnson's model also showed artifacts and conversion errors on feather details, and Li and Wand's model lacked embroidery details such as stitching and texture variations (Fig. 6). The multi-scale texture synthesis embroidery style transfer model (MTE-NST) proposed in this research transferred the detailed structure of the style map well, came closer to real embroidery work, and outperformed the other three models in terms of style and details. MTE-NST had the smallest MSE (smallest style loss) and the smallest LPIPS (highest perceptual similarity), indicating better image quality and transfer effect. Its test time and memory usage, 0.58 s and 3 900 MB respectively, were second only to Johnson's model and quite close to it, further verifying that MTE-NST can generate more realistic embroidery-style images (Tab. 2).
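
For reference, the two image-quality metrics reported in Tab. 2 can be computed as in the following sketch, which uses the open-source `lpips` package and its convention of (N, 3, H, W) tensors scaled to [-1, 1]; which reference image the paper compares against (style image or real embroidery) is an assumption here.

```python
import torch
import torch.nn.functional as F
import lpips  # pip install lpips

lpips_fn = lpips.LPIPS(net="alex")  # AlexNet-backed perceptual distance

def evaluate(generated, reference):
    mse = F.mse_loss(generated, reference).item()        # lower is better
    perceptual = lpips_fn(generated, reference).item()   # lower is better
    return mse, perceptual

# Example on 256 pixel x 256 pixel inputs, as in Tab. 2:
g = torch.rand(1, 3, 256, 256) * 2 - 1
r = torch.rand(1, 3, 256, 256) * 2 - 1
print(evaluate(g, r))
```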

Conclusion This paper proposes MTE-NST, a model that hierarchically learns multi-scale embroidery art styles. MTE-NST not only restores the color of the style image but also preserves the texture structure and fine details of image edges, which solves the problem of texture conversion mismatch and generates embroidery style transfer pictures with better visual effects.

Key words: style transfer, embroidery simulation, deep neural network, non-photorealistic rendering, Laplacian loss

CLC Number: TS195.644

Fig. 1

Embroidery style transfer model based on multi-scale texture synthesis

Fig. 2

Images with identical mean and variance but completely different distributions. (a) Uniformly distributed gray image; (b) Nonuniformly distributed image

Fig. 3

Texture synthesis effects of different statistics. (a) Input picture; (b) Average activation; (c) Histogram loss; (d) Gram loss; (e) Joint loss
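
Fig. 3 contrasts texture statistics of increasing strength. A hedged sketch of the three losses on a pair of same-sized feature maps is given below; the sorted-activation form of the histogram term is a common differentiable proxy for histogram matching (cf. Ref. [12]), not necessarily the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def mean_activation_loss(f_gen, f_style):
    # Matches only the per-channel average activation (weakest statistic).
    return F.mse_loss(f_gen.mean(dim=(2, 3)), f_style.mean(dim=(2, 3)))

def gram_loss(f_gen, f_style):
    def gram(f):
        b, c, h, w = f.shape
        f = f.view(b, c, h * w)
        return f @ f.transpose(1, 2) / (c * h * w)
    # Matches second-order channel correlations (Gatys-style statistic).
    return F.mse_loss(gram(f_gen), gram(f_style))

def histogram_loss(f_gen, f_style):
    # Sorting each channel compares the whole marginal distribution,
    # not just its first two moments (cf. Fig. 2).
    g = f_gen.flatten(2).sort(dim=2).values
    s = f_style.flatten(2).sort(dim=2).values
    return F.mse_loss(g, s)

def joint_loss(f_gen, f_style, w_gram=1.0, w_hist=1.0):
    # Fig. 3(e): combining both statistics stabilizes the synthesis.
    return (w_gram * gram_loss(f_gen, f_style)
            + w_hist * histogram_loss(f_gen, f_style))
```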

Tab. 1

Definition of training parameters

Parameter   Meaning                 Value
α           Content loss weight     0.01
β           Style loss weight       1
γ           Laplacian loss weight   0.001
Epoch       Number of iterations    3 000
Lr          Learning rate           0.001
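
Read together, the Tab. 1 weights suggest a total objective of the following form (a plausible reading of the listed parameters; the exact formulation is given in the full paper):

$$
\mathcal{L}_{\mathrm{total}} = \alpha\,\mathcal{L}_{\mathrm{content}} + \beta\,\mathcal{L}_{\mathrm{style}} + \gamma\,\mathcal{L}_{\mathrm{Laplacian}},
\qquad \alpha = 0.01,\; \beta = 1,\; \gamma = 0.001,
$$

minimized over 3 000 iterations at a learning rate of 0.001.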

Fig. 4

Generation process of image edges

Fig. 5

Comparison of two groups of transfer result images and edge detection images. (a) Sample 1; (b) Sample 2

Fig. 6

Comparison of experimental results of different models

Tab. 2

Comparison results of each model applied to 256 pixel × 256 pixel images

Model       MSE    LPIPS   Test time/s   Memory usage/MB
Ref. [13]   5.81   0.664   38.34         5 900
Ref. [14]   2.67   0.517   0.55          3 800
Ref. [15]   1.61   0.193   0.62          4 600
MTE-NST     1.43   0.175   0.58          3 900
[1] JING Y, YANG Y, FENG Z, et al. Neural style transfer: a review[J]. IEEE Transactions on Visualization and Computer Graphics, 2019, 26(11): 3365-3385. doi: 10.1109/TVCG.2945
[2] ZHANG W, CAO C, CHEN S, et al. Style transfer via image component analysis[J]. IEEE Transactions on Multimedia, 2013, 15(7): 1594-1601. doi: 10.1109/TMM.2013.2265675
[3] DONG Y, TAN W, TAO D, et al. CartoonLossGAN: learning surface and coloring of images for cartoonization[J]. IEEE Transactions on Image Processing, 2021, 31: 485-498. doi: 10.1109/TIP.2021.3130539
[4] ZHANG S, GAO X, WANG N, et al. Robust face sketch style synthesis[J]. IEEE Transactions on Image Processing, 2015, 25(1): 220-232. doi: 10.1109/TIP.2015.2501755
[5] FU F, LV J, TANG C, et al. Multi-style Chinese art painting generation of flowers[J]. IET Image Processing, 2021, 15(3): 746-762. doi: 10.1049/ipr2.v15.3
[6] ZHENG Rui, QIAN Wenhua, XU Dan, et al. Digital synthesis of embroidery style based on convolutional neural network[J]. Journal of Zhejiang University, 2019, 46(3): 270-278.
[7] QIAN W, CAO J, XU D, et al. CNN-based embroidery style rendering[J]. International Journal of Pattern Recognition and Artificial Intelligence, 2020, 34(14): 205-225.
[8] BEG M A, YU J Y. Generating embroidery patterns using image-to-image translation[EB/OL]. (2020-03-05)[2022-04-14]. https://arxiv.org/abs/2003.02909.
[9] ZHU J Y, PARK T, ISOLA P, et al. Unpaired image-to-image translation using cycle-consistent adversarial networks[C]// Proceedings of the 2017 IEEE International Conference on Computer Vision. Venice: IEEE, 2017: 2223-2232.
[10] GUAN X, LUO L, LI H, et al. Automatic embroidery texture synthesis for garment design and display online[J]. The Visual Computer, 2021, 37(9): 2553-2565. doi: 10.1007/s00371-021-02216-0
[11] LI S, XU X, NIE L, et al. Laplacian-steered neural style transfer[C]// Proceedings of the 25th ACM International Conference on Multimedia. New York: Association for Computing Machinery, 2017: 1716-1724.
[12] RISSER E, WILMOT P, BARNES C. Stable and controllable neural texture synthesis and style transfer using histogram losses[EB/OL]. (2017-01-31)[2022-04-14]. https://arxiv.org/abs/1701.08893.
[13] GATYS L, ECKER A, BETHGE M. A neural algorithm of artistic style[J]. Journal of Vision, 2015, 16(12): 326-336. doi: 10.1167/16.12.326
[14] ULYANOV D, LEBEDEV V, VEDALDI A, et al. Texture networks: feed-forward synthesis of textures and stylized images[C]// International Conference on Machine Learning. New York: ACM, 2016: 1349-1357.
[15] LI C, WAND M. Combining Markov random fields and convolutional neural networks for image synthesis[C]// 2016 IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 2479-2486.