author    mbelshe@chromium.org <mbelshe@chromium.org@0039d316-1c4b-4281-b951-d872f2087c98>  2010-07-12 15:13:29 +0000
committer mbelshe@chromium.org <mbelshe@chromium.org@0039d316-1c4b-4281-b951-d872f2087c98>  2010-07-12 15:13:29 +0000
commit    2da248c998a02b0162542e98ee9b4d7a6dd562b3 (patch)
tree      ee4e62c4d9cab3c83339d7fa97ad9113d2e2aa07 /base
parent    fc7de49e356bc0b2961170713583904a6c248a55 (diff)
Fix flakiness in the timer test.

There were two problems. First, I changed the API so that high resolution
timers must be explicitly activated, and the test was not doing so. If
nothing else happened to activate them, the timers were definitely not high
resolution anymore. Second, the test was too aggressive. We set the
resolution to 1ms (via timeBeginPeriod on Windows), which means we can see
up to ~1.5 * 1ms of slop. Without this change, I was getting failures in
about 1 in 800 runs. With the new threshold (8500us instead of 9000us), I
saw zero failures in 5000 runs before I stopped trying.

BUG=none
TEST=self

Review URL: http://codereview.chromium.org/2927005

git-svn-id: svn://svn.chromium.org/chrome/trunk/src@52080 0039d316-1c4b-4281-b951-d872f2087c98
Diffstat (limited to 'base')
-rw-r--r--  base/time_unittest.cc  9
1 file changed, 8 insertions(+), 1 deletion(-)
diff --git a/base/time_unittest.cc b/base/time_unittest.cc
index 76b357a..2950d74 100644
--- a/base/time_unittest.cc
+++ b/base/time_unittest.cc
@@ -118,11 +118,18 @@ TEST(TimeTicks, Deltas) {
}
TEST(TimeTicks, HighResNow) {
+#if defined(OS_WIN)
+ Time::ActivateHighResolutionTimer(true);
+#endif
+
TimeTicks ticks_start = TimeTicks::HighResNow();
PlatformThread::Sleep(10);
TimeTicks ticks_stop = TimeTicks::HighResNow();
TimeDelta delta = ticks_stop - ticks_start;
- EXPECT_GE(delta.InMicroseconds(), 9000);
+
+ // In high res mode, we should be accurate to within 1.5x our
+ // best granularity. On windows, that is 1ms, so use 1.5ms.
+ EXPECT_GE(delta.InMicroseconds(), 8500);
}
TEST(TimeDelta, FromAndIn) {