At the International Conference on Computer Vision (ICCV), researchers from TCS Research India presented DeepDraper, a technology for virtual clothing try-on. The idea itself is not new, but DeepDraper improves substantially on competing approaches, and the results are impressive.
The technology works as follows. A neural network analyzes uploaded photos or a short video of a person and builds their three-dimensional avatar. The user can then select any garment from a digital catalog and try it on the virtual body. Having learned the buyer's physical characteristics, the system predicts as accurately as possible how the clothing will look on the 3D avatar, explains 4PDA.
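The pipeline described above can be sketched in miniature. This is purely illustrative: the class and function names (`Avatar`, `Garment`, `build_avatar`, `drape`) are assumptions, and a single uniform scale stands in for the neural network that actually regresses garment deformation in DeepDraper.

```python
from dataclasses import dataclass

# All names and logic below are illustrative stand-ins, not DeepDraper's real API.

@dataclass
class Avatar:
    """Simplified 3D body: a few measurements instead of a full mesh."""
    height_cm: float
    chest_cm: float
    waist_cm: float

@dataclass
class Garment:
    """Catalog item: the chest size it was designed for, plus mesh vertices."""
    name: str
    design_chest_cm: float
    vertices: list  # (x, y, z) points of the garment mesh

def build_avatar(measurements: dict) -> Avatar:
    """Stand-in for the network that infers a body from photos or video."""
    return Avatar(**measurements)

def drape(garment: Garment, avatar: Avatar) -> list:
    """Toy draping: uniformly scale the garment mesh to the avatar's chest.

    The real system predicts per-vertex deformation with a neural network;
    here one scale factor stands in for that prediction.
    """
    scale = avatar.chest_cm / garment.design_chest_cm
    return [(x * scale, y * scale, z * scale) for (x, y, z) in garment.vertices]

avatar = build_avatar({"height_cm": 175.0, "chest_cm": 100.0, "waist_cm": 85.0})
shirt = Garment("t-shirt", design_chest_cm=95.0, vertices=[(1.0, 2.0, 0.5)])
fitted = drape(shirt, avatar)
```

In the real system, of course, the avatar is a full body mesh and the fit prediction is learned, but the shape of the data flow — body estimate in, deformed garment out — is the same.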
This development could be very useful for online clothing stores, helping them attract more customers. One of the project's authors, Brojeshwar Bhoumik, noted that DeepDraper runs well even on low-end smartphones and tablets, while being 23 times faster than a similar technology called Tailornet. It also uses 10 times less memory than that equivalent.
In the near future, the researchers want to experiment with animation to understand how clothes will look on a human body in motion, notes NIX Solutions. It will also become possible to try on several layers of clothing at once, for example a blouse and a jacket, or a T-shirt and a jacket.