Personal Site of Ying Huang @New New Studio {Art&Technology} {Augmented Reality} {Mixed Reality} {Generative Art} {Creative Coding} {Female Artist} 

AI Knitwear


Photo credit: @anthonyespinostudio


Post-graduate Fellowship project
Time: July 2019 - Dec 2019
Part of the summer research project SAMPLER led by Elise Co (the earlier AI pattern research component)

Responsibility and Workflow:

- Concept development: Created a series of knitwear that encodes algorithmic messages generated by machine learning, illustrating a brand-new knitted fashion aesthetic.
- Research: Researched and experimented with different machine learning algorithms to identify the most suitable one; researched the current mechanisms and limitations of machine knitting; speculated on the future direction of knitting.
- Coding and prototyping: Integrated generative machine learning algorithms with custom knitting pattern datasets; built a program to facilitate this new way of knitting.
- Making: Knitted the swatches on the Brother KH940 and Brother Bulky machines, on both single and double beds, using different techniques. Used the swatches to make finished garments.
- Post-production: Photography and branding for the dissemination and documentation of the AI knitwear fashion line.

INTRO 
AI Knitwear explores the possibility of integrating machine learning algorithms with traditional knitting techniques by examining the impact of machine-generated patterns and aesthetics on our everyday wearables: garments. Due to the glitchy nature of the AI-generated patterns, the project challenges current knitting techniques in both mechanism and aesthetics.


BACKGROUND & RESEARCH

1. Knitting Machines:
What a knitting machine looks like and how it produces fabric (you can add a motor to the carriage or push it by hand to knit). Image copyright (from top left to bottom): Goodey’s knitting toys, OurMakerLife Blog, GIPHY, Claire Williams
 
Limitations of current home knitting machines:
a. Width: Limited to the width of the knitting machine bed.
b. Multicolor: Limited to 2 colors for single-bed knitting and 4 colors for double-bed knitting. For home knitting machines, even with a color changer integrated, 4 colors is usually the maximum. The more colors used, the more floats at the back, which results in an uneven surface for single-bed fabric or a stiffer surface for double-bed fabric.
c. Customized patterns: Not really accessible; they take lots of time and effort, or money.
d. Intarsia: A technique used to knit multicolor pattern work. However, it can only lay down blocks of color, not single scattered pixels of color (see image below).
e. Open-source knitting machine hacking programs (e.g. AYAB): Only support up to 6 colors per pattern, and performance when knitting 6 colors is substandard.



Left: blocks of color (image from @annieleelarson) vs. Right: scattered pixels of color
Opportunities for innovation:
a. A multicolor changer that supports an unlimited number of colors (may require disassembly and reassembly).
b. A program that connects to the machine and supports an unlimited number of colors.
c. A machine/carriage that knits scattered multicolor work without tons of floats at the back (single bed) and/or without making the fabric overly stretchy or stiff (double bed).
 

2. Knitting patterns & AI algorithms:
What a knitting pattern chart looks like:

A knitting pattern chart is simply pixel colors on a grid; one square represents one stitch on the swatch. If the swatch is bigger than the dimensions of the chart (for example, the swatch is 100 needles wide by 100 rows tall, but the pattern chart is 10 by 10), the pattern will repeat itself 10 times per row. With the pattern chart, a knitter can knit the design in multiple colors or as a structural pattern such as lace.
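The repeat logic above can be sketched in a few lines of Python. This is only an illustration, using a made-up 2×2 checkerboard chart, not a tool from the project:

```python
# A knitting chart is just a grid of color indices; if the swatch is wider
# or taller than the chart, the chart repeats across needles and rows.
def tile_chart(chart, needles, rows):
    """Repeat a small chart to cover a swatch `needles` wide and `rows` tall."""
    h, w = len(chart), len(chart[0])
    return [[chart[r % h][c % w] for c in range(needles)] for r in range(rows)]

# 2x2 checkerboard chart: 0 = background yarn, 1 = contrast yarn
checkerboard = [[0, 1],
                [1, 0]]

swatch = tile_chart(checkerboard, needles=8, rows=4)
for row in swatch:
    print("".join(".#"[stitch] for stitch in row))
# → .#.#.#.#
#   #.#.#.#.
#   .#.#.#.#
#   #.#.#.#.
```

The same indexing applies whether the chart drives a color change or a structural stitch like lace.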

Example (images from Internet):
2-color checkerboard swatch vs. checkerboard lace pattern swatch

From the example above, we can see that using different knitting techniques, one pattern chart can produce very different fabrics. This is one of the things that fascinates me about knitting: it’s like computer graphics and After Effects, but instead of digital post-production, it is physical post-production. With different combinations of yarns, gauge, technique, and single- or double-sided fabric, you can create unlimited physical effects from the same digital image.

After I spent days collecting various knitting pattern charts for my AI training dataset, I built a fundamental understanding of those patterns and started to categorize them.

︎Traditional patterns
     

︎Figurative patterns
     

︎Geometric patterns
     

︎Abstract patterns
     
Traditional patterns are usually those we see often on a sweater from the 90s, while figurative and geometric patterns are very much self-explanatory. I found abstract patterns the most interesting, with the most potential to be surprising, because they are usually unexpected. Much like machine learning algorithms: you never know which features the algorithm decides to pick up, or why it “thinks” the result it generates is as real as a human-made one.

After gaining enough knowledge and hands-on knitting experience, I moved on to pattern generation, analysis, design, and final product production.


ALGORITHM & GENERATIVE PATTERNS

For knitting pattern generation, I used an algorithm called a GAN (Generative Adversarial Network). The algorithm learns features of the input images, such as color, texture, and composition, mimics them, then outputs a set of images that it judges to be as real as the real input.
I experimented with 3 different GAN algorithms and 5 different datasets; here are some results generated by the algorithms:
1. Trained with multi-colored pattern charts

2. Trained with only black and white pattern charts

It’s easy to see that the second set of results is more “successful” than the first. The first set is still struggling to capture the gridlines, structure, and color, while the second set has clear gridlines and pixel blocks. This is because the second set had “cleaner” data to learn from; a colored dataset requires more images and more training time. However, what caught my eye was the first set of results. The intermediate generative results, or the ones considered “collapsed”, are a perfect representation of how the machine’s neural networks are wired: the machine’s generative aesthetic. In fact, I got similar results in my last project working with other GAN models, but those glitchy colors showed up in only part of the image instead of the whole image. Therefore, I wanted to learn more about these glitchy machine results, the messages encoded in them, and their implications for us: knit these glitchy images out, make them into garments/wearables, and use, wear, and test them in real-life contexts.



MAKING

︎ PRE-PRODUCTION
     Work with algorithms to generate desired patterns:
     
     ︎Choose machine learning algorithm model
     ︎Prepare dataset
     ︎Training
     ︎Test the result


︎ PRODUCTION  
Adjust the generative result to make it suitable for machine knitting:

︎Down-sample the result:
Due to the limitations of multicolor knitting (usually 2-4 colors for machine knitting), I down-sampled the pattern to 10 colors to start with (the fewest colors that still retain the original glitchy aesthetic).

         
Left (generative result) vs. Right (down-sampled image)
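One simple way to down-sample an image to a fixed palette is to snap every pixel to its nearest palette color. This is a minimal NumPy sketch of that idea, not the project’s actual tool; the two-color palette and tiny image are invented for demonstration:

```python
import numpy as np

def quantize_to_palette(img, palette):
    """Snap every pixel of an RGB image to the nearest palette color.

    img:     (H, W, 3) uint8 array
    palette: (K, 3) sequence of the K target colors (e.g. K = 10)
    """
    img = img.astype(np.int32)
    pal = np.asarray(palette, dtype=np.int32)
    # Squared distance from every pixel to every palette color: shape (H, W, K)
    dist = ((img[:, :, None, :] - pal[None, None, :, :]) ** 2).sum(axis=-1)
    nearest = dist.argmin(axis=-1)           # (H, W) index of closest color
    return pal[nearest].astype(np.uint8)     # back to an RGB image

# Hypothetical 2-color palette and 1x2 image, just to show the mapping
palette = [(0, 0, 0), (255, 255, 255)]
img = np.array([[[10, 10, 10], [200, 220, 240]]], dtype=np.uint8)
print(quantize_to_palette(img, palette))
```

In practice the 10-color palette itself would come from the generative result (e.g. via clustering), and libraries like Pillow offer built-in quantization as well.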


     

︎Replace color based on yarn choice:
Based on the colors of the existing yarns, replace certain colors in the image to simulate the final product’s color palette; for example, replace grey with light purple.
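The grey-to-light-purple swap described above is a straightforward exact-color replacement. A minimal NumPy sketch (the RGB values for “grey” and “light purple” here are assumed, not from the project):

```python
import numpy as np

def replace_color(img, old, new):
    """Swap every pixel of one exact RGB color for another.

    img: (H, W, 3) array; old, new: RGB triples.
    """
    img = img.copy()
    # Boolean mask of pixels matching `old` on all three channels
    mask = np.all(img == np.asarray(old, dtype=img.dtype), axis=-1)
    img[mask] = new
    return img

grey, light_purple = (128, 128, 128), (200, 170, 255)  # assumed yarn colors
chart = np.array([[[128, 128, 128], [0, 0, 0]]], dtype=np.uint8)
print(replace_color(chart, grey, light_purple))
# → [[[200 170 255] [  0   0   0]]]
```

Exact matching works here because the image has already been quantized to a small palette, so every “grey” pixel has the identical value.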
 
︎Make it “machine readable”:
Since there is no free or even low-cost program that supports 10-color patterns on the machine, I had to figure out a way to make the knitting process less painful. I wrote a simple program in Processing to make it easier to translate the pattern from the computer to the knitting machine:

Demo of the program:
The program separates the pattern by color and indicates the row and needle position of each pixel, which saved me lots of time counting and memorizing the pattern.
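The core idea of such a color-separation helper can be sketched as follows. The original program was written in Processing; this is a hypothetical Python equivalent with an invented two-color chart, shown only to illustrate the row/needle bookkeeping:

```python
def color_separate(chart):
    """For each color, list the needle positions it occupies in each row.

    chart: list of rows, each a list of color names (or indices).
    Returns {color: {row: [needle positions]}}, so the knitter can work
    one color at a time without counting stitches by hand.
    """
    positions = {}
    for row, stitches in enumerate(chart, start=1):
        for needle, color in enumerate(stitches, start=1):
            positions.setdefault(color, {}).setdefault(row, []).append(needle)
    return positions

chart = [["red", "blue", "red"],
         ["blue", "blue", "red"]]
for color, rows in color_separate(chart).items():
    for row, needles in rows.items():
        print(f"{color}: row {row}, needles {needles}")
# → red: row 1, needles [1, 3]
#   red: row 2, needles [3]
#   blue: row 1, needles [2]
#   blue: row 2, needles [1, 2]
```

With 9-10 colors per pattern, this kind of per-color, per-row index is what replaces counting and memorizing stitches at the machine.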


︎ POST-PRODUCTION

To knit the three pieces I showed at the beginning of this document, I used three different techniques. They are all traditional knitting techniques, but because of the glitchy pattern and the number of colors I used, I had to apply them differently:

1. Single-bed Fair Isle ︎ All hand-manipulated, which caused an uneven tension surface; long floats at the back.
2. Single-bed intarsia ︎ Semi-hand-manipulated, with a more even surface than the first; long floats at the back, though fewer than the first.
3. Double-bed jacquard ︎ Machine knit; no floats.

Among these three techniques, I prefer multicolor intarsia over the other two. Intarsia is a good way to avoid uneven tension when hand manipulation is needed. Double-bed jacquard solves both the tension and float problems, but because so many colors were used, the pattern came out elongated and the back side shows through the front.
The video below documents the knitting process of the second piece, using single-bed intarsia.
Technique and equipment: Single-bed 9-color intarsia on both Brother standard and bulky machines
Time: 40 hours of knitting + 10 hours of sewing
 


knitted wearable objects
Internet of Knitted wearables

BRANDING
Coming soon......

SPECULATION & REFLECTION

By applying these generative patterns to knitting, the garment becomes a physical container for the digital message from the machine. Through the process of pattern making, swatch production, and lifestyle integration, the knitter becomes someone who works with the image instead of on the image, translating and encoding the message into the wearable. This process creates a network of communication as well as a unique lifestyle.

Knitted pattern performance:
Because the pattern is very dense and feature-rich, it makes a good medium for AR pattern recognition (tested with Vuforia). This opens up possibilities for mixed reality communication and experiences with knitwear. A connected knitted-wearable ecosystem could be created in public and private spaces, bridging smart knitwear/knit objects with our digital world.

Knitting mechanisms:
As I mentioned in the research section, there is room to improve the machine mechanism to make multicolor knitting easier. Though I made some modifications to the knitting technique and program, they are not enough to make a big difference. There is an opportunity to pair up with engineers and machinists to make these improvements. In fact, the CMU Textiles Lab’s project Knitout, an on-demand knitting system, is already innovating the knitting design pipeline, making it possible to integrate all knitting techniques, even including 3D knitting.


 

yunyinghhh[at]gmail[dot]com





Ying is a media artist, creative technologist, and multimedia designer whose work ranges across AR/VR technology, generative AI/ML art, visual art, video art, creative coding, and interaction design.
Her work has been featured in It’s Nice That, Hypebeast China, and Wallpaper. Her installations have been exhibited at NeurIPS 2019, PRIMER19 (New York), Die Digitale 2020, and more. She has taught as an adjunct faculty member in the Transmedia department and as a guest lecturer in the Media Design Practices department at ArtCenter College of Design.
She is now the founder and art director of The New New Studio.


Ying holds a master’s degree from ArtCenter College of Design. As a digital media artist and designer, she focuses on creating with digital media (AR, interactive media visuals, AI art, generative art). Her work has been featured in It’s Nice That (UK), Hypebeast China, and Wallpaper; her installations have been exhibited at the machine learning conference NeurIPS 2019, the speculative design conference PRIMER19 (New York), the digital media art biennial Die Digitale, and more; her media research has been published in the journal Screen Bodies. She has served as a lecturer in the Transmedia track and as a guest lecturer in Media Design Practices at ArtCenter College of Design.
She is now the principal and creative art director of the design studio New New Studio.
