Re: [creduce-dev] A reduction attempt that creduce handled poorly, where delta was able to make progress
Thanks, Prof. Regehr! (Your blog is great, by the way.)
> One thing that might be going wrong is that C-Reduce is running multiple processes in parallel and they are cd-ing to /opt/llvm/whatever and stomping each other. Can I ask you to re-run this reduction while telling C-Reduce to use just one CPU?
So with -n 1 the output is much less noisy, since nothing runs
simultaneously and the pass tracker stays visible, but it still just
racks up failure counts until it hits 1000 or so for each kind of pass.
... and now I realize what was happening. It was cutting the file down
from about 4000 lines to roughly 3500, and from then on every variant
failed to compile because it was missing all the includes. My mistake:
I should have been feeding it the preprocessed version of the file
(delta had made progress even without any preprocessing). creduce is
running now and looking more productive; I'll see where it gets.
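For the record, something like this is what I should have done from
the start (the paths and file names below are made up for
illustration):

    # Preprocess first, so the reducer can't break the build just by
    # deleting #include lines. -P drops the linemarkers, which would
    # otherwise be extra noise for the reducer to chew on.
    g++ -E -P -I/opt/llvm/include buggy.cpp -o buggy_pp.cpp

    # Then reduce single-threaded, as suggested:
    creduce -n 1 ./interesting.sh buggy_pp.cpp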
> But also I'm not sure I understand the purpose of this reduction. If you're making an LLVM source code file smaller, compiling it, and running it, then aren't you going to get killed by undefined behaviors introduced by the reducer?
Yes, I'm sure I will! That's why I'm at least requiring that the
version built with -fno-ipa-cp give the exact known-good output before
any build is considered successful. Perhaps this is misguided and my
friendly neighborhood GCC developer won't be able to use any of this
output if it's all UB in the end. And reducing just this single
miscompiled file, without modifying any of the other object files that
get linked into the same executable, isn't actually making my overall
test case much smaller or more self-contained. It's just the first
thing I've tried since getting the attention of the GCC dev whose
commit introduced the problem.
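For what it's worth, the interestingness test is shaped roughly like
the sketch below. All file names are placeholders, other.o stands in
for the unmodified objects that get linked into the same executable,
and the real script rebuilds more of the project; exit status 0 is
what tells creduce a variant is still interesting.

    #!/bin/sh
    # Hypothetical sketch of the interestingness test; names are
    # placeholders, not the real build.

    # The -fno-ipa-cp build must still produce the exact known-good
    # output, or the reducer has changed behavior (e.g. introduced UB).
    g++ -O2 -fno-ipa-cp reduced.cpp other.o -o good || exit 1
    ./good > good_out.txt 2>&1 || exit 1
    cmp -s good_out.txt known_good_output.txt || exit 1

    # The plain -O2 build must still misbehave: its output has to
    # differ from the known-good output.
    g++ -O2 reduced.cpp other.o -o bad || exit 1
    ./bad > bad_out.txt 2>&1
    ! cmp -s bad_out.txt known_good_output.txt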
-Tony