(Not sure if this is the right place to ask this)
To keep it short: would it be possible to introduce a filtering step for duplicate solutions (especially during upload), e.g. by hashing the uploaded files and rejecting files whose hash is already on the server?
For the prize challenge, I need to upload my solutions in batches because of the 1000-file limit. Not only does this already take an eternity, but on a regular basis something crashes mid-upload and I end up with only half of the solutions actually uploaded. After retrying, several solutions are then duplicated on the server, presumably slowing down the already lengthy evaluation and publishing procedure even more. Deleting all solutions to start off clean also takes ages (and for some reason massively slows down my browser).
Not sure if I'm missing something here, but it looks to me like this could improve the whole process quite a bit.
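For illustration, here is a minimal sketch of the hashing idea as it could run on either side (the helper names are hypothetical, not part of any existing uploader; it just shows that content-based dedup is cheap):

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large solution files don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def dedupe(paths):
    """Yield only the first file seen for each distinct content hash."""
    seen = set()
    for p in paths:
        d = file_digest(p)
        if d not in seen:
            seen.add(d)
            yield p
```

The same check could be done server-side at upload time: keep the set of digests of already-stored solutions and silently skip (or flag) any incoming file whose digest is already present, which would make interrupted/retried batch uploads idempotent.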