Image to 3D with Hunyuan3D w/ Texture Upscale
3D
Animation
Architecture
Flux
Game Development
Hunyuan 3D
Image to 3D
Upscaling
Nodes & Models
Hy3DModelLoader
hunyuan3d-dit-v2-0-fp16.safetensors
Hy3DCameraConfig
DownloadAndLoadHy3DPaintModel
DownloadAndLoadHy3DDelightModel
Hy3DDiffusersSchedulerConfig
Hy3DGenerateMesh
Hy3DVAEDecode
Hy3DPostprocessMesh
Hy3DDelightImage
Hy3DExportMesh
Hy3DMeshUVWrap
Hy3DRenderMultiView
Hy3DSampleMultiView
Hy3DBakeFromMultiview
Hy3DMeshVerticeInpaintTexture
CV2InpaintTexture
Hy3DApplyTexture
SolidMask
RandomNoise
KSamplerSelect
Note
Label (rgthree)
LoadImage
MaskToImage
CLIPTextEncode
Reroute
FluxGuidance
ImageCompositeMasked
Preview3D
PreviewImage
VAEEncode
ModelSamplingFlux
BasicScheduler
BasicGuider
SamplerCustomAdvanced
VAEDecode
Text Multiline
Text Concatenate
TransparentBGSession+
ImageResize+
ImageRemoveBackground+
ImageResizeKJ
ImageResizeKJ
ImageResizeKJ
Create a 3D model from a reference image, with Flux Dev for texture upscaling.
Key Inputs
Image: Use any JPG or PNG. Load the image you want to generate a 3D asset from; if it has a background, the workflow will remove it and center the subject.
Prompt: Write as descriptive a prompt as possible.
Denoise: Controls how much the upscaled texture varies from the original; higher values allow more variation.
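These key inputs can also be set outside the GUI by editing a workflow exported in ComfyUI's API format and queuing it over ComfyUI's POST /prompt endpoint. A minimal sketch, assuming a local server on the default port; the node IDs "10", "23", and "47" (for the LoadImage, CLIPTextEncode, and BasicScheduler nodes) are placeholders, so check your own exported JSON for the real IDs:

```python
import json
import urllib.request

def patch_inputs(workflow: dict, image: str, prompt: str, denoise: float) -> dict:
    """Set the key inputs on a workflow exported in ComfyUI's API format.

    The node IDs "10", "23", and "47" are placeholders for the LoadImage,
    CLIPTextEncode, and BasicScheduler nodes; look up the actual IDs in
    your own exported JSON.
    """
    wf = json.loads(json.dumps(workflow))  # deep copy so the original is untouched
    wf["10"]["inputs"]["image"] = image        # LoadImage: filename in the input folder
    wf["23"]["inputs"]["text"] = prompt        # CLIPTextEncode: the descriptive prompt
    wf["47"]["inputs"]["denoise"] = denoise    # BasicScheduler: texture-upscale denoise
    return wf

def queue_prompt(workflow: dict, host: str = "127.0.0.1:8188") -> None:
    """Queue the patched workflow on a local ComfyUI server via POST /prompt."""
    data = json.dumps({"prompt": workflow}).encode("utf-8")
    req = urllib.request.Request(
        f"http://{host}/prompt",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

Typical use: load the exported JSON from disk, call patch_inputs with your image filename, prompt, and denoise value, then pass the result to queue_prompt.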
Notes
If you aren't satisfied with the initial mesh, cancel the workflow run, ideally before it reaches the SamplerCustomAdvanced node. Applying the textures to the model takes some additional time, and you won't be able to cancel the generation during that stage.
The seed is fixed for mesh generation so that, if you need to retry the texture upscale, you don't also have to re-generate the mesh. To try a different seed for a better mesh, expand the node below and change the seed to another random number.
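If you script your runs instead of using the GUI, re-rolling the mesh seed in an exported API-format workflow is a one-line change. A minimal sketch; the node ID "5" for the Hy3DGenerateMesh node is a placeholder, so find the real one in your export:

```python
import random

def reroll_mesh_seed(workflow: dict, node_id: str = "5") -> int:
    """Replace the fixed mesh-generation seed with a fresh random one.

    node_id is a placeholder for the Hy3DGenerateMesh node in your exported
    workflow JSON. Returns the new seed so you can record it and reproduce
    a mesh you like later.
    """
    seed = random.randint(0, 2**32 - 1)
    workflow[node_id]["inputs"]["seed"] = seed
    return seed
```

Keeping the returned seed is the point: once a mesh looks good, pin that value back into the workflow so texture-upscale retries reuse the same mesh.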
Changing the seed could help in some cases, but ultimately the biggest factor is the input image.
If the first mesh isn't showing, give it a moment; additional post-processing steps for de-lighting and multiview rendering run in the background.
