There is a theoretical computer science paper, published at a very good conference, that presents an algorithm together with an explicit function of some 20 parameters, f(r1, ..., r20), given in the paper, which ties the algorithm's performance guarantee to the value of this function.
For example, if one can exhibit values r1, r2, ..., r20 at which f evaluates to 0.8, then the algorithm's performance is guaranteed to be at least 0.8.
The problem is that the authors claim there exists a set of 20 r values with f(r1, ..., r20) >= 0.8, but they never write those 20 values down explicitly in the paper.
I tried using an optimizer to find 20 values at which their stated function reaches 0.8, but I cannot find any; the closest I got was 0.74, which is not an improvement over prior algorithms. I emailed the authors two weeks ago asking for the 20 r values they claim to have used, but they have not replied.
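For concreteness, here is a sketch of the kind of search I have been running. The function `f` below is a toy placeholder, not the paper's actual function (which I am deliberately not reproducing here); the multistart hill-climbing loop is just one generic way to attack a bounded 20-parameter maximization.

```python
import random

def f(r):
    # Placeholder objective standing in for the paper's f(r1, ..., r20).
    # This toy function peaks at r_i = 0.5 with maximum value 1.0.
    return 1.0 - sum((x - 0.5) ** 2 for x in r) / len(r)

def multistart_search(n_params=20, n_starts=50, n_steps=2000, seed=0):
    """Random-restart hill climbing over the unit cube [0, 1]^n_params."""
    rng = random.Random(seed)
    best_r, best_val = None, float("-inf")
    for _ in range(n_starts):
        # Fresh random starting point for each restart.
        r = [rng.random() for _ in range(n_params)]
        val = f(r)
        step = 0.1
        for _ in range(n_steps):
            # Perturb one coordinate, clamped to [0, 1].
            i = rng.randrange(n_params)
            cand = r[:]
            cand[i] = min(1.0, max(0.0, cand[i] + rng.gauss(0.0, step)))
            cand_val = f(cand)
            if cand_val > val:
                r, val = cand, cand_val
            else:
                # Slowly shrink the step size after rejected moves.
                step = max(1e-4, step * 0.999)
        if val > best_val:
            best_r, best_val = r, val
    return best_r, best_val

best_r, best_val = multistart_search()
print(len(best_r), round(best_val, 3))
```

With the real f substituted in, a run like this is what plateaus for me around 0.74 rather than the claimed 0.8.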
My question is: what options do I have? When should I send a follow-up email? If they never respond, is there anything else I can do? Frankly, I'm surprised that none of the reviewers asked for the 20 parameters the authors claim to have used to be written explicitly in the paper.
Is the f function very smooth? Otherwise you could be looking for a needle in a haystack where, say, r17 must be between 1.3334323e-9 and 1.3334325e-9 (and the other 19 parameters are similarly tightly constrained) before optimization will lead you to the proper optimum. Knowing the r's are between 0 and 1 is great, but knowing that the function is smooth under changes in r_n of 0.001 or less would be even more valuable. – The Photon Jun 08 '23 at 06:05