If You Can’t Beat Them, Join Them: How Text-to-Image Tools Can Be Leveraged in the 3D Modelling Process

Samuel Otto Mathiesen, Alessandro Canossa

Publications: Chapter in Book/Report/Conference proceeding › Book chapter › Research › peer-review

Abstract

The rapidly growing field of text-to-image generation leverages machine learning and natural language processing to create highly diverse, high-quality images. Whilst these images can be used in many forms and for many purposes, this paper addresses a specific question: the possibility of using text-to-image generators to create concept art for the 3D character-modelling process.
There is an opportunity to explore how these tools can be incorporated into traditional content creators’ pipelines. This opportunity presents itself now because of two factors: the increased capability of current text-to-image tools compared to previous generative models, and the lack of specific focus on creators in current research.
We propose a possible pipeline for incorporating these new tools into the traditional 3D modelling process, dubbed the ‘vortex process’.
Original language: English
Title of host publication: HCI International 2023 – Late Breaking Papers: 25th International Conference on Human-Computer Interaction, HCII 2023, Copenhagen, Denmark, July 23–28, 2023, Proceedings, Part VI
Editors: Helmut Degen, Stavroula Ntoa, Abbas Moallem
Number of pages: 19
Publisher: Springer
Publication date: 23 Jul 2023
Pages: 162–181
DOIs
Publication status: Published – 23 Jul 2023
Event: HCI International 2023: 25th International Conference on Human-Computer Interaction – AC Bella Sky Hotel and Bella Center, Copenhagen, Denmark
Duration: 23 Jul 2023 – 28 Jul 2023
Conference number: 25
https://2023.hci.international/

Conference

Conference: HCI International 2023
Number: 25
Location: AC Bella Sky Hotel and Bella Center
Country/Territory: Denmark
City: Copenhagen
Period: 23/07/2023 – 28/07/2023
Series: Lecture Notes in Computer Science
ISSN: 0302-9743

Artistic research

  • Yes
