Larger RAM usage with the new config system #159

@barikata1984

Description

Hi,

I tried to run main_nerf.py from the main branch, but it suddenly stopped, printing only a one-word line: Killed. That presumably means the process ran out of RAM and was terminated (the Linux OOM killer), which matches what a quick search suggests. I watched the memory usage, and it hit its limit immediately before the app stopped. Do you have any idea how to deal with this issue?

I followed all the installation procedures, including requirements_app.txt. main_nerf.py from the stable branch works without any problems, so if the config system is the only major change between the main and stable branches, the issue is most likely caused by the new config system. I suppose you can reproduce the larger RAM usage in your environment.

I installed pyopengl_accelerate separately because a message saying the module was missing appeared the first time I ran the stable main_nerf.py; apart from that, the conda env should be a clean setup for running wisp apps.

I know the easiest solution is to add more RAM, but the stable config system works fine even with the limited RAM I have. It would be great if I could also use the new one on the same machine, since it looks much cleaner.

Thanks in advance!

Machine spec

  • OS: Ubuntu 22.04 on WSL2 on Windows 11 22H2
  • RAM: 16 GB (approx. 8 GB allotted to WSL2; see the .wslconfig sketch after this list)
  • GPU: RTX 4070 Ti
  • CUDA: 11.7
  • Torch: 1.13.1
  • Kaolin: 0.13.0
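
For reference, the RAM available to WSL2 is capped by the .wslconfig file in the Windows user profile. Below is a minimal sketch of that file; the 8 GB value only illustrates the cap mentioned above, and raising memory (or enabling swap) would be the brute-force workaround rather than a fix for the regression itself:

```ini
# %UserProfile%\.wslconfig (edited on the Windows side, then `wsl --shutdown` to apply)
[wsl2]
# RAM ceiling for the WSL2 VM; the ~8 GB above comes from a setting like this.
memory=8GB
# Optional swap so an out-of-memory situation slows down instead of killing the process.
swap=16GB
```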

Reproduction steps

  1. Install Kaolin Wisp with requirements_app.txt
  2. pip install pyopengl_accelerate
  3. python app/nerf/main_nerf.py --dataset-path /path/to/lego/ --config app/nerf/configs/nerf_hash.yaml (see the memory-monitoring sketch below)
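
For completeness, this is a minimal sketch of how the memory growth during step 3 could be logged from a second terminal. It assumes psutil is installed (pip install psutil); the script name, PID argument, and one-second polling interval are arbitrary choices, not part of wisp:

```python
# watch_rss.py -- poll a process's resident memory until it exits.
# Usage: python watch_rss.py <pid of main_nerf.py>
import sys
import time

import psutil


def main() -> None:
    proc = psutil.Process(int(sys.argv[1]))
    try:
        while proc.is_running():
            rss_gb = proc.memory_info().rss / 1024 ** 3
            avail_gb = psutil.virtual_memory().available / 1024 ** 3
            print(f"rss={rss_gb:.2f} GB  available={avail_gb:.2f} GB", flush=True)
            time.sleep(1.0)
    except psutil.NoSuchProcess:
        # The process disappeared -- either it finished or it was killed (e.g. by the OOM killer).
        print("process exited")


if __name__ == "__main__":
    main()
```

If the last line logged before the Killed message shows the available memory near zero, that would back up the OOM explanation.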
