Outline of test categories:

The conform/ tests should be non-interactive unit-tests that verify a single feature is behaving as documented. See conform/ADDING_NEW_TESTS for more details.
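
For illustration, a minimal sketch of the general shape of such a test,
using plain GLib test machinery; the real harness described in
conform/ADDING_NEW_TESTS may wrap tests differently, and the test path
and checked values here are made up:

    #include <clutter/clutter.h>

    /* Verify a single documented behaviour: the size set on an actor
     * is the size read back from it. */
    static void
    test_actor_size (void)
    {
      ClutterActor *rect = clutter_rectangle_new ();

      g_object_ref_sink (rect);

      clutter_actor_set_size (rect, 100, 50);
      g_assert_cmpfloat (clutter_actor_get_width (rect), ==, 100);
      g_assert_cmpfloat (clutter_actor_get_height (rect), ==, 50);

      g_object_unref (rect);
    }

    int
    main (int argc, char **argv)
    {
      g_test_init (&argc, &argv, NULL);

      if (clutter_init (&argc, &argv) != CLUTTER_INIT_SUCCESS)
        return 1;

      g_test_add_func ("/actor/size", test_actor_size);

      /* Non-interactive: the exit status alone reports pass/fail. */
      return g_test_run ();
    }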

The micro-bench/ tests should be focused performance tests, ideally testing a single metric. Please never forget that these tests are synthetic, and if you are using them it is expected that you understand what metric is being tested. They probably don't reflect any real-world application loads; the intention is that you use these tests once you have already determined the crux of your problem and need focused feedback that your changes are indeed improving matters. There are no exit-status requirements for these tests, but they should give clear feedback as to their performance. If the framerate is the feedback metric, then the test should forcibly enable FPS debugging.
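
As a sketch, a micro-benchmark of a single metric might look like the
following; the iteration count and the choice of actor
creation/destruction cost as the metric are illustrative, and the
CLUTTER_SHOW_FPS line shows how a framerate-driven test could forcibly
enable FPS reporting:

    #include <stdlib.h>
    #include <clutter/clutter.h>

    #define N_ITERATIONS 10000

    int
    main (int argc, char **argv)
    {
      GTimer *timer;
      int i;

      /* If framerate were the feedback metric, force FPS debugging
       * on before the backend starts up. */
      g_setenv ("CLUTTER_SHOW_FPS", "1", FALSE);

      if (clutter_init (&argc, &argv) != CLUTTER_INIT_SUCCESS)
        return EXIT_FAILURE;

      timer = g_timer_new ();

      /* The single metric under test: actor creation/destruction. */
      for (i = 0; i < N_ITERATIONS; i++)
        {
          ClutterActor *actor = clutter_rectangle_new ();

          g_object_ref_sink (actor);
          g_object_unref (actor);
        }

      /* Clear feedback on the one metric being measured; no exit
       * status requirement. */
      g_print ("%d actors created and destroyed in %.3f seconds\n",
               N_ITERATIONS, g_timer_elapsed (timer, NULL));

      g_timer_destroy (timer);

      return EXIT_SUCCESS;
    }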

The interactive/ tests are any tests whose status cannot be determined without a user looking at some visual output, providing some manual input, etc. This covers most of the original Clutter tests. Ideally some of these tests will be migrated into the conform/ directory so they can be used in automated nightly tests.
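
For illustration, an interactive test is typically just a small program
that builds a scene and runs the main loop, leaving judgement to the
viewer; the scene here (a red rectangle, quitting on a key press) is
made up:

    #include <stdlib.h>
    #include <clutter/clutter.h>

    static gboolean
    on_key_press (ClutterActor *stage,
                  ClutterEvent *event,
                  gpointer      data)
    {
      clutter_main_quit ();
      return TRUE;
    }

    int
    main (int argc, char **argv)
    {
      ClutterActor *stage, *rect;
      ClutterColor red = { 0xff, 0x00, 0x00, 0xff };

      if (clutter_init (&argc, &argv) != CLUTTER_INIT_SUCCESS)
        return EXIT_FAILURE;

      stage = clutter_stage_get_default ();

      /* Something only a user can verify visually: a red rectangle
       * drawn at the expected position. */
      rect = clutter_rectangle_new_with_color (&red);
      clutter_actor_set_size (rect, 100, 100);
      clutter_actor_set_position (rect, 50, 50);
      clutter_container_add_actor (CLUTTER_CONTAINER (stage), rect);

      g_signal_connect (stage, "key-press-event",
                        G_CALLBACK (on_key_press), NULL);

      clutter_actor_show (stage);
      clutter_main ();

      return EXIT_SUCCESS;
    }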

Other notes:
All tests should ideally include a detailed description in the source explaining exactly what the test is for, how the test was designed to work, and possibly a rationale for the approach taken for testing.
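
As an illustration, such a description might take the shape of a header
comment like the following; the structure is a suggestion, not a
project convention, and the bracketed names are placeholders:

    /* test-example.c: an illustrative header, not a project template.
     *
     * This test verifies that <feature> behaves as documented when
     * <condition> holds. It works by building <scenario>, driving it
     * with <input>, and checking <expected result>. A synthetic scene
     * is used because <rationale for the testing approach>.
     */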