This past weekend, Meshcon took place in Berlin. A conference that brings together fashion designers and technology experts makes for a nice and explosive cocktail. It is interesting to see how many interactions there are between the two fields. In fact, computers are entering fashion from multiple directions, such as:
The finished product:
With wearable devices, we have sensors, motors, buzzers, LEDs and all kinds of electronics embedded right into your clothes. Possibly they communicate with your mobile phone so that you can interact with them. I liked the idea of the Design Research Lab to make hats that can give you directions by buzzing (the brown hat).
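To make the buzzing-hat idea concrete, here is a minimal sketch of the kind of logic such a device might run. Everything here is my own invention, not the Design Research Lab's actual firmware: given the wearer's compass heading and the bearing toward the destination, decide which buzzer in the hat should pulse.

```python
# Hypothetical navigation-hat logic: choose a buzzer to pulse based on
# how far the wearer's heading is from the bearing to the destination.

def pick_buzzer(heading_deg: float, bearing_deg: float) -> str:
    """Return 'front', 'left', or 'right' for the buzzer to pulse."""
    # Signed angular difference, normalized to the range (-180, 180].
    diff = (bearing_deg - heading_deg + 180) % 360 - 180
    if abs(diff) < 15:          # roughly on course
        return "front"
    return "right" if diff > 0 else "left"

print(pick_buzzer(0, 5))    # almost on course -> front
print(pick_buzzer(0, 90))   # destination to the right -> right
print(pick_buzzer(90, 0))   # destination to the left -> left
```

On real hardware the return value would drive a vibration motor instead of a print, but the heading math would look much the same.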
It is needless to point out how important it is to have freedom-compliant wearable devices: on the one hand, to control the data they produce and share (sensitive, private or biometric, we always want to keep it under our control as much as possible); on the other hand, to make it possible to fix and improve your clothes.
The designing process:
It was interesting to see how the design and prototyping process for clothes can be made easier and more efficient using 3D graphics tools and even 3D printing. Unfortunately, it was pointed out that most of the solutions in this field are developed and distributed as proprietary software. Perhaps someone could fork Blender and make it more fashion-friendly?
There has been a lot of talk about knitting and technology. Something I find very fascinating is the ability to modify old production machines so that they can be controlled by a modern PC. This has been done by the AYAB project, and all documentation is available as Open Source Hardware and Free Software. Similarly, the Embroidermodder project allows you to draw on your computer and embroider your creation on fabric. One day it would be nice to draw some patterns on my laptop and have a sewing machine execute them, so that anyone can be a tailor. Keep up the good work!
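The core of this draw-on-screen, knit-on-machine pipeline is translating a pixel pattern into per-row machine instructions. Here is a toy sketch of that idea; this is not AYAB's or Embroidermodder's real file format, just an illustration: take a two-color pixel grid and run-length encode each row, as a hypothetical machine controller might consume it ("knit 2 dark, 2 light, ...").

```python
# Toy pattern-to-instructions step: run-length encode each row of a
# drawn two-color pattern ('X' = dark stitch, '.' = light stitch).

def row_to_runs(row):
    """Run-length encode one pattern row as [(color, count), ...]."""
    runs = []
    for px in row:
        if runs and runs[-1][0] == px:
            runs[-1] = (px, runs[-1][1] + 1)
        else:
            runs.append((px, 1))
    return runs

pattern = [
    "XX..XX",
    ".XXXX.",
]
for row in pattern:
    print(row_to_runs(row))
# "XX..XX" -> [('X', 2), ('.', 2), ('X', 2)]
```

A real toolchain adds calibration, needle selection and machine-specific encoding on top, but the pattern-to-rows translation is the conceptual heart of it.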
The shopping part:
I am not very familiar with online clothes shopping, but from what I understood the challenge is to make people buy clothes without trying them on first. So the solutions presented are a kind of virtual dressing room where you can enter your measurements and render an image showing how the item you are buying fits you.
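Behind the rendering there has to be some sizing logic. A minimal sketch of that step, with an invented size chart (none of the numbers come from a real shop): map a waist measurement to the smallest garment size that still fits.

```python
# Hypothetical sizing step of a virtual dressing room: the chart maps
# each size to its maximum waist in centimeters (invented numbers).

SIZE_CHART_CM = [("S", 80), ("M", 90), ("L", 100), ("XL", 110)]

def best_size(waist_cm: float):
    """Return the smallest size that fits, or None if nothing does."""
    for size, max_waist in SIZE_CHART_CM:
        if waist_cm <= max_waist:
            return size
    return None

print(best_size(85))    # -> M
print(best_size(120))   # -> None, nothing in the chart fits
```

A real system would of course use many more measurements and per-garment data, then feed the result into the rendered preview.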
My favorite talk:
One of the most innovative ideas I heard was Pedro Lopes' presentation of his Muscle-Propelled Force Feedback project, which aims at symmetric gesture communication between human and computer. The computer outputs information by moving your body through electrical muscle stimulation, and you input information with an accelerometer mounted on your hand.
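The symmetry can be illustrated with a very rough sketch; the function names and numbers below are mine, not from Pedro Lopes' implementation. Both directions of the channel are gestures: the computer "speaks" by driving a muscle through a pretend EMS driver, and "listens" through a wrist accelerometer.

```python
# Symmetric gesture channel, simulated: both output (EMS) and input
# (accelerometer) are hand movements rather than screens and buttons.

def ems_output(angle_deg: float) -> None:
    """Pretend EMS driver: stimulate the forearm to tilt the hand."""
    print(f"EMS: tilt hand to {angle_deg:+.0f} degrees")

def accel_input(samples) -> float:
    """Pretend accelerometer read: average tilt the user performed."""
    return sum(samples) / len(samples)

# Computer -> human: output a value as a hand movement.
ems_output(+30)

# Human -> computer: the user answers with a gesture of their own.
reply = accel_input([-28, -31, -30])
print(f"user replied with tilt {reply:+.0f} degrees")
```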
From a policy and liability point of view, it is rather scary to imagine someone hacking into your device and moving your body without your consent…
Attribution for the images:
Brown Hat: http://www.design-research-lab.org/projects/wearable-m2m/
Crazy Helmet: http://spectrum.ieee.org/geek-life/profiles/steve-mann-my-augmediated-life