What sort of events could end up consuming 'sys' time on OS X and not on
Linux?
On OS X, a program of mine that simply 'computes' reports times like:
real 0m10.883s
user 0m6.924s
sys 0m3.957s
On a nearby Linux system, in contrast, I see:
real 0m7.480s
user 0m7.172s
sys 0m0.280s
To make matters worse, this situation arose after rewriting one
particular algorithm, and neither the new version nor the old does anything
obviously system-call-ish.
Some poking around with dapptrace and iprofiler failed to turn up
anything. This is all on 10.8.2 with Xcode 4.2. The code in question is C++.