European RDM Plugfest Report
Simon Newton
4/10/2012 – 5/10/2012
I recently attended the RDM plugfest in Gatwick and had the opportunity to run the RDM Responder Tests on a diverse set of responders. At the suggestion of Peter Kirkup, I’ve run some analysis on the data in an attempt to understand how the quality of RDM implementations is progressing. The data presented here has been aggregated and anonymized so that the test results for any specific manufacturer aren’t identifiable. The tests do not perform exhaustive timing checks, so even a responder that achieves a perfect score may not fully comply with the RDM standard.
The procedure involved running the RDM Responder Tests (git SHA1 58fb444b9342ea6d9ba70ecdda8f720cc88692ff, for those who are interested) on a MacBook Pro with a Robe Universal Interface. During testing I found two bugs in the RDM tests that caused the test script to crash. I fixed both; since they caused crashes rather than false positives, they would not have affected the results of the responders tested before the fixes.
In total, 27 responders from 20 manufacturers were tested. Seven of the 27 responders received perfect scores, the highest result ever for a plugfest; at previous plugfests only two responders had managed a similar result. Congratulations to Cooper Controls – Zero 88, Creative Lighting, Howard Eaton Lighting Limited, LumenRadio & SUMOLIGHT GMBH. The test scores for each responder are shown in Table 1.
Responder Name | Tests Passed | Tests Run | Pass Rate (%) |
---|---|---|---|
Responder 1 | 231 | 231 | 100.0 |
Responder 2 | 231 | 231 | 100.0 |
Responder 3 | 229 | 229 | 100.0 |
Responder 4 | 232 | 232 | 100.0 |
Responder 5 | 233 | 233 | 100.0 |
Responder 6 | 229 | 229 | 100.0 |
Responder 7 | 229 | 229 | 100.0 |
Responder 8 | 229 | 232 | 98.7 |
Responder 9 | 226 | 231 | 97.8 |
Responder 10 | 224 | 230 | 97.4 |
Responder 11 | 226 | 233 | 97.0 |
Responder 12 | 223 | 231 | 96.5 |
Responder 13 | 224 | 233 | 96.1 |
Responder 14 | 217 | 228 | 95.2 |
Responder 15 | 220 | 233 | 94.4 |
Responder 16 | 214 | 229 | 93.4 |
Responder 17 | 205 | 220 | 93.2 |
Responder 18 | 214 | 234 | 91.5 |
Responder 19 | 206 | 228 | 90.4 |
Responder 20 | 185 | 209 | 88.5 |
Responder 21 | 185 | 209 | 88.5 |
Responder 22 | 187 | 222 | 84.2 |
Responder 23 | 191 | 231 | 82.7 |
Responder 24 | 62 | 204 | 30.4 |
Responder 25 | 37 | 154 | 24.0 |
Responder 26 | 37 | 154 | 24.0 |
Responder 27 | 31 | 190 | 16.3 |
Table 1. Summary of results by responder.
The median score is 95.2% and the median number of tests run is 229 out of 235. Note that not every test is run against each responder, because there is often a dependency chain between two or more tests. For example, the tests need to know whether a PID is supported before they know what to expect when sending a GET command for that PID. If a test that other tests depend on fails, the number of tests run is reduced, as is the case for the last few responders in the table.
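To make the dependency behaviour concrete, here is a minimal sketch of how a failed prerequisite shrinks the number of tests run. It is purely illustrative; the class and function names are invented and do not reflect the actual RDM Responder Tests framework.

```python
# Illustrative only: a tiny dependency-aware runner. Tests are assumed to be
# listed in dependency order; a test whose prerequisite did not pass is
# recorded as 'not run' instead of being executed.

class Test:
    def __init__(self, name, run_fn, deps=None):
        self.name = name
        self.run_fn = run_fn    # callable returning True (pass) or False (fail)
        self.deps = deps or []  # names of tests that must pass first

def run_all(tests):
    results = {}  # test name -> 'passed', 'failed' or 'not run'
    for test in tests:
        if any(results.get(dep) != 'passed' for dep in test.deps):
            results[test.name] = 'not run'
        else:
            results[test.name] = 'passed' if test.run_fn() else 'failed'
    return results

# Hypothetical example: if GetSupportedParameters fails, a GET test that
# depends on it never runs, so that responder's "Tests Run" count drops.
tests = [
    Test('GetSupportedParameters', lambda: False),
    Test('GetPersonality', lambda: True, deps=['GetSupportedParameters']),
]
print(run_all(tests))
# {'GetSupportedParameters': 'failed', 'GetPersonality': 'not run'}
```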
Each test is assigned to a category, which roughly corresponds to the RDM Categories/Parameter ID Defines table in the E1.20 document. When broken down by category, the results are shown in Table 2.
Test Category | Pass Rate (%) |
---|---|
Core Functionality | 95.7 |
Display Settings | 91.3 |
Status Collection | 90.7 |
Power / Lamp Settings | 90.5 |
Network Management | 90.4 |
Sensors | 89.3 |
DMX512 Setup | 89.2 |
Configuration | 88.4 |
Control | 86.9 |
Product Information | 86.9 |
Dimmer Settings | 84.0 |
Error Conditions | 82.4 |
RDM Information | 72.2 |
Sub Devices | 54.8 |
Table 2. Per Category Pass Rates.
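For reference, the per-category figures are presumably the per-test results bucketed by category, with a pass rate computed per bucket. The sketch below shows that aggregation; the function name and sample inputs are made up and are not part of the test suite.

```python
# Illustrative aggregation behind a table like Table 2: bucket each
# (category, passed) result and compute a pass rate per category.
from collections import defaultdict

def category_pass_rates(results):
    """results: iterable of (category, passed) pairs, one per executed test."""
    passed = defaultdict(int)
    run = defaultdict(int)
    for category, did_pass in results:
        run[category] += 1
        if did_pass:
            passed[category] += 1
    return {cat: 100.0 * passed[cat] / run[cat] for cat in run}

# Placeholder inputs, one pair per (test, responder) combination.
sample = [
    ('Core Functionality', True),
    ('Core Functionality', True),
    ('Sub Devices', True),
    ('Sub Devices', False),
]
print(category_pass_rates(sample))
# {'Core Functionality': 100.0, 'Sub Devices': 50.0}
```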
The RDM Responder Tests also produce warnings to indicate responder behavior that, while not serious enough to cause a failure, should still be corrected. Five responders produced no warnings; the maximum number of warnings for a single responder was 261 and the median was 10.
Finally, Table 3 shows the pass rate for each test. This provides useful insight into which areas of an RDM implementation are most often overlooked. Ignoring the tests which cover error conditions, the most common failure is SetPersonality, followed by ResetFactoryDefaults & GetParamDescription.
Pass Rate (%) | # of Tests | Test Names |
---|---|---|
100 | 29 | CheckSensorConsistency, DUBFullTree, DUBInvertedFullTree, DUBInvertedRange, DUBManufacturerTree, DUBNegativeUpperBound, DUBSignedComparisons, DUBSingleUpperUID, GetDeviceInfoWithData, GetManufacturerLabel, GetSoftwareVersionLabel, GetSoftwareVersionLabelWithData, GetSupportedParameters, ProxiedDevicesControlField, SetBroadcastStartAddress, SetBurnIn, SetDMXBlockAddress, SetDevicePowerCyclesWithNoData, SetDisplayInvert, SetDisplayLevel, SetIdentifyMode, SetLampState, SetLanguage, SetPanInvert, SetPanTiltSwap, SetPowerOnSelfTest, SetPowerState, SetUnsupportedLanguage, SetVendorcastStartAddress |
96 | 2 | ClearStatusMessages, ClearStatusMessagesWithData |
95.7 | 2 | GetSlotDescriptions, GetStartAddress |
95.5 | 2 | GetPersonalityDescriptions, GetSensorValues |
95.2 | 17 | DUBAffirmativeLowerBound, DUBAffirmativeUpperBound, DUBDifferentManufacturer, DUBInvertedLowerUID, DUBInvertedUpperUID, DUBNegativeLowerBound, DUBNegativeVendorcast, DUBSingleLowerUID, DUBSingleUID, GetDefaultSlotValues, GetProxiedDeviceCount, RequestsWhileUnmuted, ResetDevicePowerCycles, SetDeviceLabel, SetDevicePowerCycles, SetLampHours, SetLampStrikesWithNoData |
95 | 1 | SetVendorcastDeviceLabel |
93.8 | 1 | RecordAllSensorValues |
92 | 23 | ClearCommsStatus, FindSelfTests, GetDeviceHoursWithData, GetDeviceInfo, GetDeviceModelDescriptionWithData, GetDisplayInvertWithData, GetLampHoursWithData, GetLampOnMode, GetLampStrikes, GetManufacturerLabelWithData, GetPresetMergeModeWithData, GetProxiedDeviceCountWithData, GetProxiedDevicesWithData, GetTiltInvert, GetTiltInvertWithData, MuteDevice, SetBurnInWithNoData, SetPanInvertWithNoData, SetParamDescription, SetProxiedDevices, SetSlotDescription, SetSlotInfo, SetTiltInvertWithNoData |
91.3 | 7 | GetPersonality, GetPersonalityDescription, GetSensorDefinition, SetLampOnMode, SetLampStrikes, SetTiltInvert, UnMuteDevice |
90.9 | 2 | RecordSensorValues, RecordUndefinedSensorValues |
90.5 | 8 | DUBPositiveVendorcast, SetBroadcastIdentifyDevice, SetDeviceHours, SetFullSizeDeviceLabel, SetIdentifyDevice, SetNonAsciiDeviceLabel, SetVendorcastIdentifyDevice, SubDeviceControlField |
90 | 2 | SetBroadcastDeviceLabel, SetLampHoursWithNoData |
88 | 9 | GetDeviceModelDescription, GetFactoryDefaults, GetFactoryDefaultsWithData, GetRecordSensors, GetSupportedParametersWithData, InvalidDiscoveryPID, RecordSensorValueWithNoData, SetPerformSelfTestWithNoData, SetPowerStateWithNoData |
86.4 | 1 | GetUndefinedSensorValues |
85.7 | 3 | DUBPositiveUnicast, SetEmptyDeviceLabel, SetOtherVendorcastIdentifyDevice |
84.2 | 1 | SetDeviceHoursWithNoData |
84 | 68 | ClearCommsStatusWithData, GetBootSoftwareLabel, GetBootSoftwareLabelWithData, GetBootSoftwareVersion, GetBootSoftwareVersionWithData, GetBurnIn, GetBurnInWithData, GetCapturePreset, GetCommsStatus, GetCommsStatusWithData, GetDMXBlockAddress, GetDMXBlockAddressWithData, GetDeviceHours, GetDeviceLabel, GetDeviceLabelWithData, GetDevicePowerCycles, GetDevicePowerCyclesWithData, GetDimmerInfo, GetDimmerInfoWithData, GetDisplayInvert, GetDisplayLevel, GetDisplayLevelWithData, GetIdentifyDevice, GetIdentifyDeviceWithData, GetIdentifyMode, GetIdentifyModeWithData, GetLampHours, GetLampOnModeWithData, GetLampState, GetLampStateWithData, GetLampStrikesWithData, GetLanguage, GetPanInvert, GetPanInvertWithData, GetPanTiltSwap, GetPanTiltSwapWithData, GetPowerOnSelfTest, GetPowerOnSelfTestWithData, GetPowerState, GetPowerStateWithData, GetPresetMergeMode, GetPresetPlayback, GetPresetPlaybackWithData, GetProductDetailIdList, GetProductDetailIdListWithData, GetProxiedDevices, GetRealTimeClock, GetRealTimeClockWithData, GetSelfTestDescription, GetSelfTestDescriptionWithNoData, GetSlotInfo, SetBootSoftwareLabel, SetBootSoftwareVersion, SetCapturePresetWithNoData, SetDMXBlockAddressWithNoData, SetDefaultSlotInfo, SetDimmerInfo, SetLampOnModeWithNoData, SetLampStateWithNoData, SetPanTiltSwapWithNoData, SetPowerOnSelfTestWithNoData, SetPresetPlayback, SetPresetPlaybackWithNoData, SetProductDetailIdList, SetProxiedDeviceCount, SetRealTimeClockWithNoData, SetZeroDMXBlockAddress, SetZeroPersonality |
82.6 | 1 | SetOutOfRangePersonality |
81.8 | 1 | ResetSensorValue |
80 | 25 | CapturePreset, GetDefaultSlotInfoWithData, GetInvalidSensorDefinition, GetLanguageCapabilities, GetLanguageCapabilitiesWithData, GetMaxPacketSize, GetParamDescriptionForNonManufacturerPid, GetPerformSelfTest, GetPerformSelfTestWithData, GetSlotInfoWithData, ResetFactoryDefaultsWithData, SetDeviceInfo, SetDeviceModelDescription, SetDeviceModelDescriptionWithData, SetDisplayInvertWithNoData, SetDisplayLevelWithNoData, SetIdentifyModeWithNoData, SetManufacturerLabel, SetManufacturerLabelWithData, SetOversizedDMXBlockAddress, SetPerformSelfTest, SetRealTimeClock, SetSensorDefinition, SetSoftwareVersionLabel, SetSupportedParameters |
78.3 | 2 | GetOutOfRangePersonalityDescription, SetOutOfRangeStartAddress |
77.3 | 1 | ResetAllSensorValues |
76 | 1 | SetNonAsciiLanguage |
72.7 | 2 | ResetUndefinedSensorValues, SetStartAddress |
72.2 | 1 | GetParamDescription |
72 | 2 | GetClearStatusMessages, GetZeroPersonalityDescription |
68 | 4 | GetInvalidSensorValue, GetSensorDefinitionWithTooMuchData, GetSlotDescriptionWithNoData, GetSlotDescriptionWithTooMuchData |
66.7 | 3 | FindSubDevices, SetOutOfRangeIdentifyDevice, SetOversizedDeviceLabel |
65.2 | 1 | SetZeroStartAddress |
64 | 2 | GetParamDescriptionWithData, ResetSensorValueWithNoData |
60 | 3 | GetSensorDefinitionWithNoData, ResetFactoryDefaults, SetOversizedPersonality |
58.8 | 1 | SetPersonality |
52.4 | 1 | SetIdentifyDeviceWithNoData |
52.2 | 1 | SetOversizedStartAddress |
52 | 2 | AllSubDevicesDeviceInfo, GetSensorValueWithNoData |
48 | 1 | MuteDeviceWithData |
40 | 1 | UnMuteDeviceWithData |
Table 3. Pass rates by individual test.
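Since SetPersonality tops the list of non-error-condition failures, a sketch of the responder-side behaviour that area of E1.20 calls for may be useful. The code below is illustrative only; the class and method names are invented and it is not taken from the test suite or any responder. The idea is that a SET DMX_PERSONALITY with a valid personality number should be acknowledged and then reflected in DEVICE_INFO (current personality and DMX footprint), while an out-of-range value should be NACKed with NR_DATA_OUT_OF_RANGE.

```python
# Illustrative sketch of DMX_PERSONALITY handling on a responder.

NR_DATA_OUT_OF_RANGE = 0x0006  # E1.20 NACK reason for out-of-range data

class Personality:
    def __init__(self, footprint, description):
        self.footprint = footprint      # DMX slots used by this personality
        self.description = description

class Responder:
    def __init__(self, personalities):
        self.personalities = personalities  # personalities are numbered 1..N
        self.current_personality = 1

    def set_dmx_personality(self, requested):
        """Handle SET DMX_PERSONALITY: ACK valid values, NACK the rest."""
        if not 1 <= requested <= len(self.personalities):
            return ('NACK', NR_DATA_OUT_OF_RANGE)
        self.current_personality = requested
        return ('ACK', None)

    def device_info(self):
        # After a successful SET, DEVICE_INFO should report the new personality
        # and a DMX footprint that matches it; this is the sort of consistency
        # the tests look for.
        active = self.personalities[self.current_personality - 1]
        return {
            'current_personality': self.current_personality,
            'personality_count': len(self.personalities),
            'dmx_footprint': active.footprint,
        }

r = Responder([Personality(1, 'Dimmer'), Personality(3, 'RGB')])
print(r.set_dmx_personality(2), r.device_info())  # ACK, footprint becomes 3
print(r.set_dmx_personality(9))                   # NACK, data out of range
```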
I plan to continue performing this analysis at future plugfests so we’ll have an idea of how RDM implementations are improving. It will be useful to track the maturity of RDM responders as well as monitor the diversity of manufacturers and devices we see at the plugfests. Judging by the amount of work that got done over the two days, I think we can expect even better scores next time! Stay tuned for another report after the Dallas plugfest in January 2013.