Hi @Alexander0Yang, I tested the transforms.json with Instant-NGP. I don't think the Inria code can load .json / synthetic Blender datasets that have varied camera intrinsics; we have 5 camera models (varying pixel width, height, and lenses) in this dataset.
Can you recommend a way to test the .json with any standard Gaussian splatting method?
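For reference, here is a minimal sketch of what per-camera intrinsics could look like, assuming the instant-ngp transforms.json convention where per-frame fl_x / fl_y / cx / cy / w / h override the global values; I don't know whether your loader reads these per-frame fields:

```python
import json

# Hypothetical example (not taken from the 4DGaussians code): per-frame intrinsics
# in an instant-ngp style transforms.json. In that convention, intrinsics inside a
# frame entry override the top-level ones, which is how several camera models can
# coexist in one file.
transforms = {
    "aabb_scale": 16,
    "frames": [
        {
            "file_path": "./images/cam0_0000.png",
            "fl_x": 1200.0, "fl_y": 1200.0,   # focal lengths for camera model 0
            "cx": 960.0, "cy": 540.0,
            "w": 1920, "h": 1080,
            "transform_matrix": [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]],
        },
        {
            "file_path": "./images/cam1_0000.png",
            "fl_x": 800.0, "fl_y": 800.0,     # a second camera model with different optics
            "cx": 640.0, "cy": 360.0,
            "w": 1280, "h": 720,
            "transform_matrix": [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]],
        },
    ],
}

print(json.dumps(transforms, indent=2))
```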
I can't get any clean results from using custom data.
I created a transforms.json by converting my COLMAP model for one frame using the colmap2nerf script from instant-ngp; this trains great in Instant-NGP:
https://github.com/NVlabs/instant-ngp/blob/master/scripts/colmap2nerf.py
Then I created a script that duplicates the frame entries for a specified number of frames and adds a time stamp. Here is my resulting .json; can you see any issues with it, @Alexander0Yang? Does your code accept per-camera intrinsics?
transforms_train.json
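In case it helps, a minimal sketch of the kind of duplication script described above, assuming the input is an instant-ngp style transforms.json and that the dynamic loader expects a normalized "time" field per frame (the field name and the file-naming scheme are assumptions, not taken from the 4DGaussians code):

```python
import copy
import json

NUM_FRAMES = 300  # assumed number of time steps in the capture

with open("transforms.json") as f:
    base = json.load(f)

out = dict(base)
out["frames"] = []

for t in range(NUM_FRAMES):
    # Normalized timestamp in [0, 1]; many dynamic-NeRF loaders expect this convention.
    time_stamp = t / max(NUM_FRAMES - 1, 1)
    for frame in base["frames"]:
        new_frame = copy.deepcopy(frame)
        new_frame["time"] = time_stamp
        # Assumed naming scheme: the static file_path points at time step 0000 and
        # each time step has its own image, e.g. ./images/cam0_0000.png -> cam0_0001.png.
        # Adjust this to match the actual dataset layout.
        new_frame["file_path"] = frame["file_path"].replace("0000", f"{t:04d}")
        out["frames"].append(new_frame)

with open("transforms_train.json", "w") as f:
    json.dump(out, f, indent=2)
```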