Powered by ThinkDiffusion

Image to 3D with Hunyuan3D w/ Texture Upscale


Create a 3D model from a reference image, with Flux Dev for texture upscaling.

Key Inputs

Image: Use any JPG or PNG. Load the image you want to generate a 3D asset from; if it has a background, the workflow will remove it and center the subject.

Prompt: Write as descriptive a prompt as possible.

Denoise: How much the upscaled texture can deviate from the source render. Higher values produce more variation.
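To make the Denoise input concrete, here is a rough sketch of the common diffusion-sampler convention it follows: denoise 1.0 runs the full sampling schedule (a complete re-generation), while lower values run only the tail of the schedule, preserving more of the input. This is an illustration of the general idea, not ComfyUI's exact implementation.

```python
def effective_steps(total_steps: int, denoise: float) -> int:
    """Approximate number of sampler steps actually run for a given denoise.

    Sketch of the usual convention: denoise=1.0 runs all steps; lower
    values start partway through the schedule, so more of the original
    image survives. Not ComfyUI's exact code.
    """
    if not 0.0 <= denoise <= 1.0:
        raise ValueError("denoise must be in [0, 1]")
    return round(total_steps * denoise)

# With a 20-step schedule: denoise 1.0 runs 20 steps (full re-generation),
# 0.5 runs 10 (moderate variation), and 0.2 runs 4 (subtle touch-up).
```

In practice this is why small denoise values are safer for texture upscaling: fewer steps mean the upscaled texture stays close to what the mesh stage produced.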

Notes

If you aren’t satisfied with the initial mesh, cancel the workflow, preferably before it reaches the SamplerCustomAdvanced node. Applying the textures to the model takes additional time, and you won’t be able to cancel the generation during that stage.

The seed is fixed for mesh generation so that retrying the texture upscale doesn't also regenerate the mesh. If you would like to try a different seed for a better mesh, expand the node below and change the seed to another random number.

Changing the seed could help in some cases, but ultimately the biggest factor is the input image.
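The fixed-seed design above amounts to a caching pattern: if the expensive mesh step is keyed on the same inputs (image and seed) every run, retries of the later texture stage can reuse the mesh instead of recomputing it. A minimal sketch, with hypothetical function names standing in for the Hunyuan3D and texture stages:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def generate_mesh(image_id: str, seed: int) -> str:
    # Stand-in for the expensive mesh-generation step; cached by
    # (image, seed), mirroring the workflow's fixed mesh seed.
    return f"mesh({image_id}, seed={seed})"

def texture_pass(image_id: str, mesh_seed: int, texture_seed: int) -> str:
    # Each texture retry reuses the cached mesh as long as mesh_seed
    # is unchanged; only the texture seed varies between attempts.
    mesh = generate_mesh(image_id, mesh_seed)
    return f"textured[{mesh}, tex_seed={texture_seed}]"

# Two texture attempts, one mesh computation:
a = texture_pass("ref.png", 42, 1)
b = texture_pass("ref.png", 42, 2)
```

Changing the mesh seed invalidates this reuse, which is why the workflow asks you to edit it explicitly rather than randomizing it every run.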

If the first mesh isn't showing, give it a moment; additional post-processing steps (de-light and multiview) run in the background.


