
Unexpected ComplexGen Reconstruction Results – Am I Missing Something? #19

@atntjt


Thank you for this great project. I’m currently trying it out and have some questions because the results I’m getting don’t look correct to me. I’m wondering if I made a mistake somewhere along the way. Here’s my process:

Files: workspace.zip


1. Point Cloud Data

I placed the point cloud file base_aufbereitet_Res256 - Kopie.ply in the folder /workspace/scripts/ (included in the ZIP archive).


2. Increasing the Point Count to 10,000

To increase the number of points to 10,000, I used the script denisty_ply_utf8.py in /workspace/scripts/ with the following content (also in the ZIP archive):

import open3d as o3d
import numpy as np

# Load the point cloud
pcd = o3d.io.read_point_cloud("base_aufbereitet_Res256 - Kopie.ply")

# Optional: Calculate normals if not present
if not pcd.has_normals():
    pcd.estimate_normals(search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.01, max_nn=30))

# Surface reconstruction (Poisson)
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)

# Sample a dense point cloud from the mesh
pcd_dense = mesh.sample_points_poisson_disk(number_of_points=10000)

# Save the point cloud
o3d.io.write_point_cloud("dichte_punktwolke_open3d.ply", pcd_dense)

After running:

/workspace/scripts# python denisty_ply_utf8.py

I got the file dichte_punktwolke_open3d.ply (included in the ZIP). Checking in Meshlab shows 10,000 points.
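Instead of opening Meshlab each time, the vertex count can also be checked directly from the PLY header. A small sketch (the helper name is mine, not from ComplexGen; it relies only on the fact that the PLY header is ASCII text even in binary files):

```python
import os
import tempfile

def ply_vertex_count(path):
    """Return N from the 'element vertex N' line of a PLY header.
    Works for ASCII and binary PLY alike, since the header is always ASCII."""
    with open(path, "rb") as f:
        for raw in f:
            line = raw.decode("ascii", errors="ignore").strip()
            if line.startswith("element vertex"):
                return int(line.split()[-1])
            if line == "end_header":
                break
    raise ValueError("no 'element vertex' line found in PLY header")

# Tiny self-contained demo: write a 3-vertex ASCII PLY and count it.
demo = (
    "ply\nformat ascii 1.0\nelement vertex 3\n"
    "property float x\nproperty float y\nproperty float z\n"
    "end_header\n0 0 0\n1 0 0\n0 1 0\n"
)
with tempfile.NamedTemporaryFile("w", suffix=".ply", delete=False) as f:
    f.write(demo)
    demo_path = f.name
print(ply_vertex_count(demo_path))  # prints 3
os.remove(demo_path)
# On the real file, ply_vertex_count("dichte_punktwolke_open3d.ply")
# should report 10000.
```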


3. Normalizing to [-0.5, 0.5]^3

Next, I ran meineNormalisierung_utf8.py (included in the ZIP) to normalize the points to [-0.5, 0.5]^3, with the following content:

import open3d as o3d
import numpy as np
import pickle

# Paths according to ComplexGen structure
PATH_TEMPLATE_PKL = "/workspace/data/train_small/packed/packed_000000.pkl"
PATH_OUTPUT_PKL = "/workspace/data/default/test/packed/meinInput.pkl"

PATH_OUTPUT_PLY = "/workspace/data/default/test_point_clouds/00000006_0_10000.ply"

pcd = o3d.io.read_point_cloud("dichte_punktwolke_open3d.ply")

points_np = np.asarray(pcd.points)   # Nx3
normals_np = np.asarray(pcd.normals) # Nx3

# 2) Normalize to [-0.5, 0.5]^3
min_xyz = points_np.min(axis=0)
max_xyz = points_np.max(axis=0)
center = 0.5 * (min_xyz + max_xyz)
extent = (max_xyz - min_xyz).max()

pts_centered = points_np - center
pts_normed   = pts_centered / extent

# 3) Nx6: (x,y,z,nx,ny,nz)
surface_points = np.hstack([pts_normed, normals_np])  # shape (N,6)

#################################
# Part A: Creating .pkl
#################################

# Load template
with open(PATH_TEMPLATE_PKL, "rb") as fin:
    sample = pickle.load(fin)

# surface_points Nx6
sample['surface_points'] = surface_points

# Write new file meinInput.pkl
with open(PATH_OUTPUT_PKL, "wb") as fout:
    pickle.dump(sample, fout)

print(f"Done: {PATH_OUTPUT_PKL} created and surface_points replaced.")

#################################
# Part B: ASCII-PLY
#################################

pcd_out = o3d.geometry.PointCloud()
pcd_out.points  = o3d.utility.Vector3dVector(pts_normed)
pcd_out.normals = o3d.utility.Vector3dVector(normals_np)

# ASCII format
o3d.io.write_point_cloud(
    PATH_OUTPUT_PLY,
    pcd_out,
    write_ascii=True
)

print(f"Done: ASCII-PLY {PATH_OUTPUT_PLY} written.")

After running:

/workspace/scripts# python meineNormalisierung_utf8.py

I got the following files (in the ZIP archive):

  • /workspace/data/default/test/packed/meinInput.pkl
  • /workspace/data/default/test_point_clouds/00000006_0_10000.ply
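To rule out a scaling mistake as the cause, here is a numpy-only sketch of the same min/max normalization, checked against random data (the function name is mine, not from ComplexGen):

```python
import numpy as np

def normalize_unit_cube(points):
    """Center by the bounding-box midpoint and scale by the largest extent,
    so every coordinate lands in [-0.5, 0.5] (same formula as the script above)."""
    min_xyz = points.min(axis=0)
    max_xyz = points.max(axis=0)
    center = 0.5 * (min_xyz + max_xyz)
    extent = (max_xyz - min_xyz).max()
    return (points - center) / extent

rng = np.random.default_rng(0)
pts = rng.uniform(-30.0, 50.0, size=(10000, 3))  # arbitrary raw coordinates
normed = normalize_unit_cube(pts)
print(normed.min(), normed.max())  # both within [-0.5, 0.5]
```

Note that the normals need no rescaling here, since the scaling is uniform in all three axes.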

4. Choosing and Running a Test Script

We can choose from test_default.sh, test_noise002.sh, test_noise005.sh, or test_partial.sh. I previously tried test_default.sh, test_noise002.sh, and test_partial.sh, but none of them seemed to produce a correct result. Therefore, I decided to use test_noise005.sh. In that file, I also changed --gpu 0 to --gpu 0,1,2,3,4.

Execute with:

/workspace# ./scripts/test_noise005.sh

This generated new files in the folder /workspace/experiments/noise005/test_obj/ (in the ZIP).



5. Running extraction_noise005.sh

Then, I ran the script extraction_noise005.sh.

  • Before running it, I modified gurobi_wrapper.py under /workspace/PostProcess/:
    • max_time = 5*60 to max_time = 120*60 (otherwise the gap remains high)
    • m.setParam('Threads', 2) to m.setParam('Threads', 24) (not sure if it helps)

Execute with:

/workspace# ./scripts/extraction_noise005.sh

After about 2 hours (which seems quite long?), the solver reached a gap of 0.58%. The new files in /workspace/experiments/noise005/test_obj/ (in the ZIP) are:
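For context on that figure: Gurobi's reported MIP gap is the relative distance between the incumbent objective and the best bound, so besides raising max_time it may be worth loosening the MIPGap termination tolerance (e.g. m.setParam('MIPGap', 0.01) to stop at 1%) rather than waiting hours. A sketch of the gap formula, with purely illustrative numbers (not from this run):

```python
def mip_gap(obj_val, obj_bound):
    """Relative MIP gap as Gurobi reports it: |bound - incumbent| / |incumbent|."""
    return abs(obj_bound - obj_val) / abs(obj_val)

# Illustrative only: an incumbent of 172.0 against a best bound of 171.0
# gives roughly a 0.58% gap.
print(round(100 * mip_gap(172.0, 171.0), 2))  # prints 0.58
```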

  • /workspace/experiments/noise005/test_obj/00000006_0_prediction.complex
  • /workspace/experiments/noise005/test_obj/00000006_0_extraction.complex



6. Visualizing .complex Files

Now I want to visualize 00000006_0_prediction.complex and 00000006_0_extraction.complex. According to the instructions, I first copy complexgen.mtl:

/workspace# cp /workspace/vis/complexgen.mtl /workspace/experiments/noise005/test_obj/complexgen.mtl

Then:

/workspace# cd vis
/workspace/vis# python gen_vis_result.py -i /workspace/experiments/noise005/test_obj/00000006_0_prediction.complex
/workspace/vis# python gen_vis_result.py -i /workspace/experiments/noise005/test_obj/00000006_0_extraction.complex
/workspace/vis# cd ..

This produces:

  • /workspace/experiments/noise005/test_obj/00000006_0_extraction.obj
  • /workspace/experiments/noise005/test_obj/00000006_0_prediction.obj

00000006_0_extraction.obj (see screenshot “3_extraction” in the ZIP)

00000006_0_prediction.obj (see screenshot “3_prediction” in the ZIP)


7. Running geometric_refine.py --noise

Next, I run geometric_refine.py --noise (because it’s recommended for noisy/partial data). Before that, I updated these paths in the script:

exe_path = r'./GeometricRefine/Bin/GeometricRefine'
pc_ply_path = r'./data/default/test_point_clouds'

# complex_path = r'./experiments/default/test_obj'
# complex_path = r'./experiments/partial/test_obj'
# complex_path = r'./experiments/noise002/test_obj'

complex_path = r'./experiments/noise005/test_obj'
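Since geometric_refine.py depends on all three of these paths, a quick pre-flight check can catch a typo before launching the run (the helper below is my own, not part of the repo):

```python
import os

def missing_paths(paths):
    """Return the subset of paths that do not exist on disk."""
    return [p for p in paths if not os.path.exists(p)]

# Paths as configured above; adjust to your own setup.
required = [
    './GeometricRefine/Bin/GeometricRefine',
    './data/default/test_point_clouds',
    './experiments/noise005/test_obj',
]
bad = missing_paths(required)
print("Missing:", bad) if bad else print("All paths present")
```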

Execute with:

/workspace# python ./scripts/geometric_refine.py --noise

The script generates more files (in the ZIP):

  • /workspace/experiments/noise005/test_obj/00000006_0_geom_refine.json
  • /workspace/experiments/noise005/test_obj/00000006_0_extraction_input.json

Then I visualize them with gen_vis_result.py:

/workspace# cd vis
/workspace/vis# python gen_vis_result.py -i /workspace/experiments/noise005/test_obj/00000006_0_geom_refine.json
/workspace/vis# python gen_vis_result.py -i /workspace/experiments/noise005/test_obj/00000006_0_extraction_input.json
/workspace/vis# cd ..

This produces:

  • /workspace/experiments/noise005/test_obj/00000006_0_geom_refine.obj
  • /workspace/experiments/noise005/test_obj/00000006_0_extraction_input.obj

00000006_0_geom_refine.obj (see screenshot “4_geom_refine” in the ZIP)

00000006_0_extraction_input.obj (see screenshot “4_extraction_input” in the ZIP)


8. Question / Observation

I’m wondering if I might have overlooked something in my process. The point cloud doesn’t seem particularly complex, so I was hoping ComplexGen could handle and reconstruct it easily. For comparison, the provided example 00861558_0_10000.ply ran successfully for me (results included in the ZIP under ComplexGen_Example).

I’d really appreciate any feedback, tips, or suggestions on how to solve this.
Thank you again for this great project!
