When I was executing tutorial_batch_processing.ipynb, the program at first ran well and quickly reported which scenarios were solved and unsolved. However, it seemed to be unstoppable even after it had already produced all the solutions. It used up all of my RAM and crashed the Jupyter notebook page. Even after I closed all terminals and the browser, the Python processes kept running (there were more than four Python processes running together, each taking a few gigabytes of RAM), and I was not able to kill them.
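For anyone hitting the same problem, a rough way to find and terminate leftover worker processes from the shell (the `tutorial_batch_processing` pattern is an assumption; adjust it to whatever appears in your process list):

```shell
# List surviving python processes with PID and full command line
# (pgrep -fa matches against the whole command line)
pgrep -fa python || echo "no python processes found"

# A wedged worker may ignore SIGTERM; SIGKILL cannot be caught.
# Once you have identified the PIDs above, either kill them directly:
#   kill -9 <PID>
# or kill everything whose command line matches the notebook name:
#   pkill -9 -f tutorial_batch_processing
```

The destructive commands are left as comments on purpose; run them only after checking the listing so you do not kill unrelated Python processes.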
The operating system I’m using is Ubuntu 20.04.5 LTS; RAM: 46.8 GiB; CPU: 12th Gen Intel® Core™ i9-12900K × 24; GPU: NVIDIA GeForce RTX 3080 Ti; architecture: x86_64.
Could you tell us which function in the tutorial (run_parallel_processing or run_sequential_processing), with which planner in SMP/batch_processing/batch_processing_config.yaml, you were executing when the program crashed?
Please note that run_parallel_processing is basically the one you need. run_sequential_processing is only for debugging scenarios and currently does not work well with some classes of search algorithms (it should only fail with some depth-limited searches, I think).
Thanks for bringing this to our attention. We will work on it.
Could you also please provide a screenshot or the output of pip list with the commonroad-py37 env activated? That way we can check whether this is an issue caused by wrong package versions.
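To capture that, something like the following should work (run it after activating the env with `conda activate commonroad-py37`; the output filename is just a suggestion):

```shell
# Dump the full package list of the active environment to a file
# that can be attached to this thread
python3 -m pip list > pip_list.txt

# Quick look at just the commonroad-related packages
# (case-insensitive match; prints a note if none are installed)
grep -i commonroad pip_list.txt || echo "no commonroad packages found"
```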
According to your pip list, the packages you installed for this exercise look correct. However, several other packages are also installed in the same environment, for example ROS-related ones. I would recommend setting up a fresh conda env for this exercise; you can follow the README in the repository to create a new one.
When you set up the new env, please verify the Shapely package version according to this thread.
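A quick one-liner to check which Shapely version ended up in the active environment (compare the printed version against the one pinned in the linked thread):

```shell
# Print the installed Shapely version, or a note if it is missing
python3 -c "import shapely; print(shapely.__version__)" \
    || echo "shapely is not installed"
```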