Journal of Textile Research ›› 2023, Vol. 44 ›› Issue (09): 84-90. doi: 10.13475/j.fzxb.20220504801

• Textile Engineering •

  • Corresponding author: LIU Chengxia (1975—), female, professor, Ph.D. Her research focuses on clothing digitalization. E-mail: glorior_liu@163.com
  • About the author: YAO Linhan (1998—), female, master's student. Her research focuses on clothing digitalization and garment image processing.
  • Funding: National Natural Science Foundation of China (51405446); Natural Science Foundation of Zhejiang Province (LY20E050017); Zhejiang Provincial University Students' Science and Technology Innovation Activity Plan (Xinmiao Talent Program) (2020R406084)

Embroidery style transfer modeling based on multi-scale texture synthesis

YAO Linhan1, ZHANG Ying1, YAO Lan1, ZHENG Xiaoping2, WEI Wenda3, LIU Chengxia1,4,5

  1. School of Fashion Design & Engineering, Zhejiang Sci-Tech University, Hangzhou, Zhejiang 310018, China
    2. China Textile Engineering Society, Beijing 100025, China
    3. China Textile Planning Institute of Construction, Beijing 100125, China
    4. Key Laboratory of Silk Culture Heritage and Product Design Digital Technology, Ministry of Culture and Tourism, Hangzhou, Zhejiang 310018, China
    5. Zhejiang Province Engineering Laboratory of Clothing Digital Technology, Hangzhou, Zhejiang 310018, China
  • Received: 2022-05-16 Revised: 2022-12-21 Published: 2023-09-15 Online: 2023-10-30

Abstract:

To address the shortcomings of existing embroidery style simulation algorithms, such as imprecise image details and a lack of semantic depth, an embroidery style transfer model based on multi-scale texture synthesis (MTE-NST) is proposed. The model consists of two parts, a generation network and a loss network, with the generation network further comprising a content matching module, a structure enhancement module, and a texture refinement module. Joint training with a multi-stylized loss is introduced to optimize the embroidery transfer image iteratively and hierarchically, reducing the influence of each individual loss term on the transfer result. The results show that, compared with existing convolutional neural network style transfer algorithms, MTE-NST generates clearer embroidery thread textures and multi-directional stitch trajectories, markedly reduces the artifacts caused by image matching errors, and produces more realistic embroidery artworks. These findings help improve the appearance simulation design of embroidery products and promote the development and innovation of embroidery techniques.

Key words: style transfer, embroidery works, deep neural network, non-realistic rendering, Laplacian loss
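
Purely as a structural sketch of the two-part design described in the abstract above (a generation network with content matching, structure enhancement, and texture refinement modules, plus a loss network), the following PyTorch skeleton shows one plausible wiring; the module internals are placeholder assumptions, since this page does not specify them.

```python
# Skeleton of the generation network named in the abstract; the module
# bodies are illustrative stand-ins, not the authors' architecture.
import torch
import torch.nn as nn

class GenerationNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        # three stages named in the abstract; internals are placeholders
        self.content_matching = nn.Conv2d(3, 32, 3, padding=1)
        self.structure_enhancement = nn.Conv2d(32, 32, 3, padding=1)
        self.texture_refinement = nn.Conv2d(32, 3, 3, padding=1)

    def forward(self, x):
        x = torch.relu(self.content_matching(x))
        x = torch.relu(self.structure_enhancement(x))
        return torch.sigmoid(self.texture_refinement(x))

net = GenerationNetwork()
out = net(torch.rand(1, 3, 256, 256))  # stylized output, (1, 3, 256, 256)
```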

Abstract:

Objective Existing embroidery image generation algorithms still suffer from problems such as monotonous styles in the generated images, coarse features, and numerous artifacts. To address this situation, an embroidery style transfer model based on multi-scale texture synthesis is proposed. By improving existing style transfer algorithms, it is expected to create embroidery-style images with higher perceptual quality.

Method A multi-stylized loss function was adopted to extract the edge information and detailed structure of the content image and the style image, and the results were compared with transfer images generated using only the content loss and the style loss.
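
As a rough illustration of how such a multi-stylized loss can combine a content term, a Gram-matrix style term, and a Laplacian edge term, the PyTorch sketch below is offered; it assumes precomputed VGG feature maps, and every function and variable name in it is illustrative rather than taken from the authors' code.

```python
# A sketch of the three loss terms, assuming PyTorch; names are
# illustrative, not from the paper's code.
import torch
import torch.nn.functional as F

def gram_matrix(feat):
    # feat: (B, C, H, W) feature map from one VGG layer
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def content_loss(gen_feat, content_feat):
    # match feature responses of the generated and content images
    return F.mse_loss(gen_feat, content_feat)

def style_loss(gen_feats, style_feats):
    # match Gram-matrix statistics over several VGG layers
    return sum(F.mse_loss(gram_matrix(g), gram_matrix(s))
               for g, s in zip(gen_feats, style_feats))

def laplacian_loss(gen_img, content_img):
    # compare Laplacian-filtered images to preserve edge structure
    k = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]])
    k = k.view(1, 1, 3, 3).repeat(gen_img.shape[1], 1, 1, 1)
    lap = lambda x: F.conv2d(x, k.to(x), padding=1, groups=x.shape[1])
    return F.mse_loss(lap(gen_img), lap(content_img))
```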

Results Experimental results showed that the multi-stylized loss function could generate embroidery transfer images with clear textures. Gatys' model produced more artifacts and deformations when converting images with dense lines, Johnson's model also showed artifacts and conversion errors on feather details, and Li and Wand's model lacked embroidery details such as stitching and texture variations (Fig. 6). The multi-scale texture synthesis embroidery style transfer model (MTE-NST) proposed in this research transferred the detailed structure of the style map well, came closer to real embroidery work, and outperformed the other three models in terms of style and details. MTE-NST achieved the smallest MSE (the smallest style loss) and the smallest LPIPS (the highest image perceptual similarity), indicating better image quality and transfer effect. Its test time and occupied memory, 0.58 s and 3 900 MB respectively, were second only to Johnson's model and quite close to it, further verifying that MTE-NST can generate more realistic embroidery style images (Tab. 2).
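
The two quantitative metrics reported here could be computed roughly as in the sketch below, which uses the open-source lpips package; treating MSE as a plain image-space error is an assumption made for illustration, and the tensors are random placeholders.

```python
# A hedged sketch of the two metrics in Tab. 2; image-space MSE is an
# illustrative stand-in, and the inputs below are random placeholders.
import torch
import torch.nn.functional as F
import lpips  # pip install lpips

loss_fn = lpips.LPIPS(net='alex')  # learned perceptual similarity

def evaluate(gen_img, ref_img):
    # gen_img, ref_img: (B, 3, H, W) tensors scaled to [-1, 1]
    mse = F.mse_loss(gen_img, ref_img).item()
    perceptual = loss_fn(gen_img, ref_img).mean().item()
    return mse, perceptual

gen = torch.rand(1, 3, 256, 256) * 2 - 1
ref = torch.rand(1, 3, 256, 256) * 2 - 1
print(evaluate(gen, ref))  # lower is better for both metrics
```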

Conclusion This paper proposes MTE-NST, a model that hierarchically learns multi-scale embroidery art styles. MTE-NST can not only restore the color of the style image but also preserve the texture structure and fine details of the image edges, which solves the problem of texture conversion mismatch and generates embroidery style transfer images with better visual effects.

Key words: style transfer, embroidery simulation, deep neural network, non-realistic rendering, Laplacian loss

CLC number: TS195.644

Fig. 1 Embroidery style transfer model based on multi-scale texture synthesis

Fig. 2 Images with different means and variances

Fig. 3 Texture synthesis results with different statistics

Tab. 1 Training parameter definitions

Parameter   Meaning                 Value
α           Content loss weight     0.01
β           Style loss weight       1
γ           Laplacian loss weight   0.001
Epoch       Number of iterations    3 000
Lr          Learning rate           0.001
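
Read together with the loss sketch given earlier, these weights would combine into a single training objective roughly as follows; this reuses the illustrative content_loss, style_loss, and laplacian_loss functions from that sketch and is not the authors' training code.

```python
# Weighted sum of the three loss terms using the values in Tab. 1;
# reuses the illustrative loss functions sketched earlier.
ALPHA, BETA, GAMMA = 0.01, 1.0, 0.001  # content, style, Laplacian weights

def total_loss(gen_feat, content_feat, gen_feats, style_feats,
               gen_img, content_img):
    return (ALPHA * content_loss(gen_feat, content_feat)
            + BETA * style_loss(gen_feats, style_feats)
            + GAMMA * laplacian_loss(gen_img, content_img))
```

If the raw magnitudes of the three terms are comparable, these weights let the style statistics drive the optimization while the content and Laplacian terms act as lighter regularizers.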

Fig. 4 Image edge generation pipeline

Fig. 5 Comparison of style transfer images and edge detection images for two sample groups

Fig. 6 Comparison of experimental results of different models

Tab. 2 Comparison results for 256 pixel × 256 pixel images

Model       MSE    LPIPS   Test time/s   Memory used/MB
Ref. [13]   5.81   0.664   38.34         5 900
Ref. [14]   2.67   0.517   0.55          3 800
Ref. [15]   1.61   0.193   0.62          4 600
MTE-NST     1.43   0.175   0.58          3 900
[1] JING Y, YANG Y, FENG Z, et al. Neural style transfer: a review[J]. IEEE Transactions on Visualization and Computer Graphics, 2019, 26(11): 3365-3385.
doi: 10.1109/TVCG.2945
[2] ZHANG W, CAO C, CHEN S, et al. Style transfer via image component analysis[J]. IEEE Transactions on Multimedia, 2013, 15(7): 1594-1601.
doi: 10.1109/TMM.2013.2265675
[3] DONG Y, TAN W, TAO D, et al. CartoonLossGAN: learning surface and coloring of images for cartoonization[J]. IEEE Transactions on Image Processing, 2021, 31: 485-498.
doi: 10.1109/TIP.2021.3130539
[4] ZHANG S, GAO X, WANG N, et al. Robust face sketch style synthesis[J]. IEEE Transactions on Image Processing, 2015, 25(1): 220-232.
doi: 10.1109/TIP.2015.2501755
[5] FU F, LV J, TANG C, et al. Multi-style Chinese art painting generation of flowers[J]. IET Image Processing, 2021, 15(3): 746-762.
doi: 10.1049/ipr2.v15.3
[6] ZHENG Rui, QIAN Wenhua, XU Dan, et al. Digital synthesis of embroidery style based on convolutional neural network[J]. Journal of Zhejiang University, 2019, 46(3): 270-278.
[7] QIAN W, CAO J, XU D, et al. CNN-based embroidery style rendering[J]. International Journal of Pattern Recognition and Artificial Intelligence, 2020, 34(14): 205-225.
[8] BEG M A, YU J Y. Generating embroidery patterns using image-to-image translation[EB/OL]. (2020-03-05) [2022-04-14]. https://arxiv.org/abs/2003.02909.
[9] ZHU J Y, PARK T, ISOLA P, et al. Unpaired image-to-image translation using cycle-consistent adversarial networks[C]// Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV). Venice: IEEE, 2017: 2223-2232.
[10] GUAN X, LUO L, LI H, et al. Automatic embroidery texture synthesis for garment design and display online[J]. The Visual Computer, 2021, 37(9): 2553-2565.
doi: 10.1007/s00371-021-02216-0
[11] LI S, XU X, NIE L, et al. Laplacian-steered neural style transfer[C]// Proceedings of the 25th ACM International Conference on Multimedia. New York: Association for Computing Machinery, 2017: 1716-1724.
[12] RISSER E, WILMOT P, BARNES C. Stable and controllable neural texture synthesis and style transfer using histogram losses[EB/OL]. (2017-01-31) [2022-04-14]. https://arxiv.org/abs/1701.08893.
[13] GATYS L, ECKER A, BETHGE M. A neural algorithm of artistic style[J]. Journal of Vision, 2015, 16(12): 326-336.
doi: 10.1167/16.12.326
[14] ULYANOV D, LEBEDEV V, VEDALDI A, et al. Texture networks: feed-forward synthesis of textures and stylized images[C]// International Conference on Machine Learning. New York: ACM, 2016: 1349-1357.
[15] LI C, WAND M. Combining Markov random fields and convolutional neural networks for image synthesis[C]// 2016 IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 2479-2486.