AI Could Steal Fashion Models' Jobs: New Algorithm Creates Whole-Body Images

Called a Generative Adversarial Network (GAN), this artificial intelligence algorithm can generate high-resolution, photorealistic images of people who do not exist. The machine learning system produces superficially authentic photos with realistic features such as faces, hair, and outfits.
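To make the adversarial idea concrete, here is a minimal, illustrative GAN training step in PyTorch. This is not Datagrid's actual model; the network sizes, the flattened 64x64 image format, and the optimizer settings are placeholder assumptions chosen for brevity. It only shows the core mechanic the article describes: a generator learns to produce images that a discriminator cannot tell apart from real photos.

```python
# Minimal GAN sketch (illustrative only, not Datagrid's system).
import torch
import torch.nn as nn

IMG_DIM = 64 * 64 * 3   # flattened 64x64 RGB image (placeholder size)
NOISE_DIM = 128         # dimensionality of the random input vector

# Generator: maps random noise to a fake image.
generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 512), nn.ReLU(),
    nn.Linear(512, IMG_DIM), nn.Tanh(),
)

# Discriminator: scores how "real" an image looks.
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 1),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_images: torch.Tensor) -> None:
    """One adversarial update: discriminator first, then generator."""
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1) Train the discriminator to separate real photos from generated ones.
    noise = torch.randn(batch, NOISE_DIM)
    fake_images = generator(noise).detach()
    d_loss = bce(discriminator(real_images), real_labels) + \
             bce(discriminator(fake_images), fake_labels)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator.
    noise = torch.randn(batch, NOISE_DIM)
    g_loss = bce(discriminator(generator(noise)), real_labels)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Example call with random tensors standing in for a batch of training photos.
train_step(torch.randn(16, IMG_DIM))
```

In a real system like the one described here, the networks would be convolutional and trained on a large dataset of model photographs, but the two-player training loop remains the same.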

This could be a revolutionary moment for the fashion industry: fashion brands and advertising agencies can use the AI to create photogenic models and cut costs on lighting, catering, and human resources. It could also leave real photo and fashion models out of work.

The system was created by Datagrid, a technology company based on the campus of Kyoto University in Japan. Its automatic whole-body model generation AI learns from a large number of full-body model photos and generates images of non-existent people at high resolution (1024 x 1024).

GANs have typically been used to imitate existing material, such as video game levels or images that look like hand-drawn caricatures. The Japanese tech firm previously created an automatic idol generation AI in June last year; however, that AI could only generate faces, which limited how expressive the generated people could be.

The latest AI is part of the firm's effort to enhance the expressive power of the generated person. Datagrid has been working on research and development of "whole-body generation" and "motion generation", both of which are challenging because they have no precedent.

Datagrid aims to improve the accuracy of the automatic whole-body model generation AI and to develop the motion generation AI. Additionally, the firm will conduct demonstration experiments with advertising agencies and fashion brands to develop the functions required for real industrial use.

Some critics say GANs could be used to produce fake or incriminating photographs that undermine public trust in digital media. A related technology, "deepfakes", can doctor images and videos and has been widely used to generate propaganda, including fictionalized political speeches, as well as pornography.

The technology is difficult to counter, to the point that the Canadian Broadcasting Corporation has called it a matter of national security.
