Hi, I tried git pull and it aborted with this error:
error: Your local changes to the following files would be overwritten by merge:
notebooks/tutorials/.ipynb_checkpoints/tutorial_commonroad-search-checkpoint.ipynb
notebooks/tutorials/tutorial_commonroad-search.ipynb
Please commit your changes or stash them before you merge.
error: The following untracked working tree files would be overwritten by merge:
GSMP/motion_automata/automata/MotionPlanner_gbfs_only_time.py
pdfs/0_Guide_for_Exercise.pdf
pdfs/1.Brief_Introduction_to_CommonRoad_io.pdf
scenarios/exercise/CHN_Sha-10_2_T-1.xml
scenarios/exercise/CHN_Sha-11_3_T-1.xml
scenarios/exercise/CHN_Sha-11_4_T-1.xml
scenarios/exercise/CHN_Sha-12_2_T-1.xml
scenarios/exercise/CHN_Sha-13_2_T-1.xml
scenarios/exercise/CHN_Sha-14_2_T-1.xml
scenarios/exercise/CHN_Sha-15_3_T-1.xml
scenarios/exercise/CHN_Sha-15_4_T-1.xml
scenarios/exercise/CHN_Sha-16_2_T-1.xml
scenarios/exercise/CHN_Sha-17_2_T-1.xml
scenarios/exercise/CHN_Sha-1_5_T-1.xml
scenarios/exercise/CHN_Sha-1_6_T-1.xml
… (A few more scenarios)
Aborting
So I tried git add * ; git stash, as suggested by a Stack Overflow answer, but that did not work either… Kindly suggest something.
This is the right way. With git reset you basically discarded all your changes, which made git pull possible in your case. Please let me know whether you can work smoothly with the newer version of the batch processing. Thanks!
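For anyone hitting the same error later: the sketch below reproduces the situation in a throwaway repo and applies the discard-everything fix (git reset --hard for modified tracked files, git clean -fd for the blocking untracked files). The repo layout and file names are made up for the demo, and note that both commands throw local work away.

```python
import os
import subprocess
import tempfile

def git(*args, cwd, check=True):
    """Run a git command in the given directory."""
    return subprocess.run(["git", *args], cwd=cwd, check=check,
                          capture_output=True, text=True)

tmp = tempfile.mkdtemp()
upstream = os.path.join(tmp, "upstream")
work = os.path.join(tmp, "work")

# Upstream repo with one commit.
os.mkdir(upstream)
git("init", cwd=upstream)
git("config", "user.email", "demo@example.com", cwd=upstream)
git("config", "user.name", "demo", cwd=upstream)
open(os.path.join(upstream, "file.txt"), "w").write("v1\n")
git("add", ".", cwd=upstream)
git("commit", "-m", "init", cwd=upstream)

# Clone it, then advance upstream: modify a tracked file and add a new one.
git("clone", upstream, work, cwd=tmp)
open(os.path.join(upstream, "file.txt"), "w").write("v2\n")
open(os.path.join(upstream, "extra.txt"), "w").write("new\n")
git("add", ".", cwd=upstream)
git("commit", "-m", "update", cwd=upstream)

# Local edits that block the pull: a tracked change plus an untracked
# file that the incoming merge would overwrite.
open(os.path.join(work, "file.txt"), "w").write("local\n")
open(os.path.join(work, "extra.txt"), "w").write("local\n")
blocked = git("pull", cwd=work, check=False)
assert blocked.returncode != 0  # "would be overwritten by merge"

# The fix: discard local state, then pull cleanly.
git("reset", "--hard", cwd=work)  # restores tracked files
git("clean", "-fd", cwd=work)     # removes untracked files/dirs
git("pull", cwd=work)
```

If you want to keep your local edits instead of discarding them, git stash --include-untracked (rather than reset/clean) also unblocks the pull, and git stash pop restores the edits afterwards.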
Yes, it worked. When I run the new batch processing script, it runs on all scenarios. I just wanted to ask: it found 30 solutions out of 300 with A* and GBFS under the default config (the SM1 cost function, I think), so I do not know how to get to the 102 solutions!
That was computed with a 120-second timeout. Also, the survival scenarios were solved using the new planner (gbfs_only_time). I could solve around 40 with regular GBFS and around 60 with gbfs_only_time, which added up to 102 for me. (It is also possible to combine regular GBFS and the only-time variant into one planner by inspecting the type of the goal state and calculating the heuristic accordingly.) The cost functions (SM1, etc.) do not affect the search for a solution; they are only used to evaluate the performance in the benchmark.
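To illustrate the combined-planner idea in the parentheses above, here is a minimal sketch. The GoalState class and all names are hypothetical stand-ins, not the actual CommonRoad API: if the goal state carries a position, use a distance heuristic as in regular GBFS; if it only has a time interval (a survival scenario), fall back to a time-only heuristic.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class GoalState:
    """Hypothetical goal description: survival scenarios have no goal
    position, only a time interval the ego vehicle must survive."""
    position: Optional[Tuple[float, float]]  # None for survival scenarios
    time_interval: Tuple[int, int]           # (earliest, latest) time step

def combined_heuristic(pos: Tuple[float, float], time_step: int,
                       goal: GoalState) -> float:
    """Dispatch on the goal type: distance for regular scenarios,
    remaining time steps for survival scenarios."""
    if goal.position is not None:
        # regular GBFS: Euclidean distance to the goal position
        dx = goal.position[0] - pos[0]
        dy = goal.position[1] - pos[1]
        return (dx * dx + dy * dy) ** 0.5
    # gbfs_only_time: time steps left until the goal interval opens
    return float(max(0, goal.time_interval[0] - time_step))
```

A greedy best-first search could then order its frontier by this single function regardless of scenario type, instead of switching between two planner scripts.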
Thanks for the reply. I downloaded the 27/11 VM but am facing this error: when I run any ipynb, the memory just fills up, and there seems to be no option to increase the memory allocated to the virtual machine. Kindly help!
Also, just one more doubt: when we run tutorial_commonroad-search.ipynb for one scenario and find a solution, irrespective of its cost, can we count that as one of those 110 scenarios?
Thanks a lot! One more off-topic question: is there a faster desktop we could use in some lab or anywhere? My laptop just hangs most of the time after running a few scenarios. I have tried both Linux and Windows, with Docker and a VM. Thanks!
I had a look; only 5% of the time is spent on collision detection. Everything else is trajectory feasibility checking together with optimization, implemented in Python. The actual time on a powerful machine is given here.
It could be possible to get down to 5+5 seconds per scenario by configuring the search algorithm parameters.
State-of-the-art algorithms have close-to-realtime performance. Search is search.
The compute room on the ground floor has very slow thin clients.
Use Google Colaboratory instead: sessions are limited to 12 hours or less, but the CPUs are powerful, and you can run several instances in parallel. Configure the environment once and upload the binaries to git, or use Docker.
Hi, when I run gbfs_only_time, I get 0 solutions with the 120-second timeout. I pulled the latest version over the weekend. Any idea what could be going wrong, or how to run that particular .py file?