A recent study from the University of California, San Diego, conducted in collaboration with Adobe Research, shows how AI and neural networks could one day create custom clothing and apparel designs, helping designers and retailers sell products tailored to buyer preferences learned by these networks.
The model used by the researchers shows how the technology can be applied generatively: given only a user's profile and the product category they are shopping in, it can generate a new set of images that are highly consistent with that user's personal taste. This is a significant first step toward systems that go beyond recommending items within a business's existing selection and help design new products by suggesting styles based on a client's preferences. The findings were published on arXiv in early November in a paper entitled "Visually-Aware Fashion Recommendation and Design with Generative Image Models." Julian McAuley, a professor of Computer Science and Engineering, and Ph.D. student Wang-Cheng Kang collaborated on the research with Zhaowen Wang and Chen Fang of Adobe Research.
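To make the generative-recommendation idea concrete, here is a minimal, hypothetical sketch in PyTorch. It assumes a conditional image generator and a learned preference scorer (both stand-ins, not the authors' published models) and optimizes the generator's latent code so the produced design scores highly for a given user and category; all names, dimensions, and architectures below are illustrative assumptions rather than the paper's actual implementation.

```python
# Hypothetical sketch: pair a conditional image generator with a learned
# preference scorer, then search the generator's latent space for a design
# that a specific user is predicted to like in a chosen product category.
import torch
import torch.nn as nn

LATENT_DIM, USER_DIM, CAT_DIM, IMG_SIZE = 64, 32, 8, 32  # illustrative sizes

class Generator(nn.Module):
    """Maps (noise, user embedding, category embedding) to a small RGB image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + USER_DIM + CAT_DIM, 256), nn.ReLU(),
            nn.Linear(256, 3 * IMG_SIZE * IMG_SIZE), nn.Tanh(),
        )

    def forward(self, z, user, cat):
        x = torch.cat([z, user, cat], dim=-1)
        return self.net(x).view(-1, 3, IMG_SIZE, IMG_SIZE)

class PreferenceScorer(nn.Module):
    """Scores how well an image matches a user's visual taste."""
    def __init__(self):
        super().__init__()
        self.img_enc = nn.Sequential(
            nn.Flatten(), nn.Linear(3 * IMG_SIZE * IMG_SIZE, USER_DIM))

    def forward(self, img, user):
        # Dot-product affinity between the image features and the user embedding.
        return (self.img_enc(img) * user).sum(dim=-1)

def design_for_user(gen, scorer, user, cat, steps=100, lr=0.05):
    """Gradient ascent on the latent code to maximize predicted preference."""
    z = torch.randn(1, LATENT_DIM, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = -scorer(gen(z, user, cat), user).mean()  # negate: maximize score
        loss.backward()
        opt.step()
    return gen(z, user, cat).detach()

if __name__ == "__main__":
    gen, scorer = Generator(), PreferenceScorer()  # would be pretrained in practice
    user = torch.randn(1, USER_DIM)   # stand-in for a taste profile learned from purchases
    cat = torch.randn(1, CAT_DIM)     # stand-in for a category embedding, e.g. "shirts"
    img = design_for_user(gen, scorer, user, cat)
    print(img.shape)  # torch.Size([1, 3, 32, 32])
```

In this toy setup the generator and scorer are random, so the output is meaningless; the point is the workflow, in which a recommendation-style preference model steers an image generator toward new designs rather than only ranking existing products.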
Their research points to new technology that improves not only recommendation but also design and production. The proposed frameworks, they believe, could lead to even richer forms of recommendation in which content generation and recommendation are closely linked.
The project set out to test how well AI and machine learning tools can help both ends of the industry, fashion designers and producers on one side and consumers on the other, take a more collaborative approach to producing and consuming fashion. Tools and algorithms that help online retailers make design and product recommendations already exist, but the research team went a step further, looking more deeply into consumers' purchase behavior to reflect their individual preferences.