Hello,
I have two problems related to batch processing:
The first is that for some scenarios, when I run them in the search notebook, my student algorithm easily finds a solution (the search takes around 15 seconds) and I can generate a valid solution file. As an example, see my uploaded solution for: DEU_Flensburg-86_1_T-1:2020a/
If I try to solve the exact same scenario with the batch processing notebook (both parallel and sequential), it eats up all the memory and after some time returns a timeout.
I have double-checked the configuration file and everything else, but still haven't found a hint of where to look.
Note that batch processing works well for all scenarios except around 40, where this problem occurs.
Because of this, I will generate the solutions for those scenarios manually (using the same algorithm, of course) and upload them to increase my chances in the prize challenge.
The second problem is that the sequential processing script does not respect the timeout variable in the configuration file: if the search algorithm gets stuck, it eats up all the memory and never returns. In contrast, the parallel processing script kills the search after the specified time.
I have looked through both scripts, and the difference between them is that parallel processing uses “process_scenario” for the search, while sequential processing uses “debug_scenario”, which contains no timeout handling at all.
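For reference, this is roughly the kind of timeout wrapper that seems to be missing from the sequential path: run the search in a child process, wait up to the configured limit, and kill it if it is still alive. All names here (`run_with_timeout`, the worker setup) are my own illustration, not the actual notebook API, and I'm assuming a POSIX system for the `fork` start method:

```python
import multiprocessing as mp
import queue as queue_mod


def run_with_timeout(func, args=(), timeout=60.0):
    """Run func(*args) in a child process; kill it after `timeout` seconds.

    Illustrative sketch only. With the "fork" start method (POSIX) the
    nested worker closure is fine; on Windows/macOS "spawn" you would
    need a module-level worker function instead.
    """
    ctx = mp.get_context("fork")
    result_q = ctx.Queue()

    def _worker():
        # Child process: run the (possibly hanging) search and ship
        # the result back through the queue.
        result_q.put(func(*args))

    proc = ctx.Process(target=_worker)
    proc.start()
    proc.join(timeout)          # wait at most `timeout` seconds
    if proc.is_alive():
        proc.terminate()        # search exceeded the limit -> kill it
        proc.join()
        return None             # treat as timeout, move to next scenario
    try:
        return result_q.get(timeout=1.0)
    except queue_mod.Empty:
        return None             # child died without producing a result
```

With something like this, the sequential loop could call `run_with_timeout(search_fn, (scenario,), timeout=config_timeout)` per scenario and simply skip the ones that return `None`, instead of hanging on a stuck search.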