Rama Allen says that human/AI collaboration is an aesthetic dialogue, much like improvisational jazz. It's a great reflection on what modern art could be.
It allows you to create a digital voice that sounds like you with only one minute of audio.
For each line of dialogue, it automatically selects the most appropriate clip from the input takes, based on a user-specified set of film-editing idioms. The resulting cut serves as a solid first draft for a human editor.
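The idea of scoring takes against editing idioms can be sketched as follows. This is a minimal, hypothetical illustration: the idiom names, scoring weights, and greedy selection pass are my assumptions, not the tool's actual implementation (which may use more sophisticated optimization over the whole scene).

```python
# Hypothetical sketch: pick the best take for each dialogue line by
# scoring takes against user-specified editing idioms.
# All idioms and weights below are illustrative assumptions.

IDIOMS = {
    "speaker_visible": 2.0,  # prefer a shot that shows the speaker
    "avoid_jump_cut": 1.0,   # penalize cutting to the same camera again
}

def score(take, line, previous_take):
    """Score one take for one line of dialogue."""
    s = 0.0
    if line["speaker"] in take["visible_faces"]:
        s += IDIOMS["speaker_visible"]
    if previous_take is not None and take["camera"] == previous_take["camera"]:
        s -= IDIOMS["avoid_jump_cut"]
    return s

def assemble_cut(lines, takes):
    """Greedy pass: for each line, keep the highest-scoring take."""
    cut, prev = [], None
    for line in lines:
        best = max(takes, key=lambda t: score(t, line, prev))
        cut.append((line["text"], best["camera"]))
        prev = best
    return cut

lines = [{"speaker": "A", "text": "Hi"}, {"speaker": "B", "text": "Hello"}]
takes = [
    {"camera": "wide", "visible_faces": {"A", "B"}},
    {"camera": "close_A", "visible_faces": {"A"}},
]
print(assemble_cut(lines, takes))
```

The appeal of the approach is exactly this separation of concerns: the editor states idioms declaratively, and the system does the tedious per-line clip comparison.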
Computer scientists from the Flow Machines project unveiled "Daddy’s Car", a Beatles-esque track billed as one of the first pop songs composed by artificial intelligence. See also how Nao Tokui added background noise to Google Street View scenes.
A terrific TEDx talk by Maurice Conti about algorithms in industrial design, architecture, and solving wicked problems.
Tate Modern and Microsoft collaborated on an exhibition where a machine learning algorithm selected artworks from the museum's collection. Philipp Schmitt ran a similar experiment with a photo album.
A review of high-tech shoe initiatives by Nike, adidas, New Balance, and Under Armour.
IBM Watson helped 20th Century Fox create an engaging movie trailer. It looks like they're now applying this experiment at scale.
A tool based on the idea of generative design:
1. An algorithm generates many variations of a design using predefined rules and patterns.
2. The results are filtered based on design quality and task requirements.
3. Designers choose the most interesting and adequate variations, polishing them if needed.
The tool made a lot of noise and prompted several publications from UX gurus. Autodesk designed its new Toronto office using these ideas.
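The three-step generative loop above can be sketched in a few lines. This is a toy illustration under stated assumptions: the design parameters (table legs, width, material), the requirement checks, and the quality score are all invented for the example, not drawn from any real generative-design tool.

```python
import random

# Step 1: generate many variations from predefined rules and patterns.
def generate_variations(n, seed=0):
    rng = random.Random(seed)
    return [
        {"legs": rng.choice([3, 4, 6]),
         "width_cm": rng.uniform(60, 180),
         "material": rng.choice(["oak", "steel", "plywood"])}
        for _ in range(n)
    ]

# Step 2a: filter out variants that violate task requirements.
def meets_requirements(design):
    return design["legs"] >= 4 and design["width_cm"] >= 80

# Step 2b: a crude quality score (here: widest span per leg).
def quality(design):
    return design["width_cm"] / design["legs"]

# Step 3: shortlist the top candidates for a human designer to polish.
def shortlist(designs, k=3):
    valid = [d for d in designs if meets_requirements(d)]
    return sorted(valid, key=quality, reverse=True)[:k]

for candidate in shortlist(generate_variations(100)):
    print(candidate)
```

Real systems replace the random generator with parametric or simulation-driven models, but the division of labor is the same: the machine explores and filters, the human chooses and refines.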