E1.20 RDM (Remote Device Management) Protocol Forums  

July 5th, 2009   #1
kster
Junior Member
 
Join Date: Jun 2009
Posts: 5
Summed Break and MAB time

In the RDM specification, Table 3-3, line 1, the maximum break and MAB are specified as 1 second for receiving (responder). So when I receive a break of 0.99 seconds followed by a MAB of 0.99 seconds, is this still a valid signal?

If this is the case, it would be very difficult to comply with when the UART framing error is used to detect the break. A framing error can only detect the start of the break, not the end (which is the start of the MAB).

I also find it very unclear what must happen when the timing of the incoming signal is not within spec, for example when the MAB is less than 8 µs. Must the packet then be dropped? This would also be very difficult to implement using only a UART.

Regards,

Kurt
July 5th, 2009   #2
ericthegeek
Task Group Member
 
Join Date: Aug 2008
Posts: 375

Your questions pertain more to E1.11 (DMX512A) than to E1.20 (RDM).

> So when I receive a break of 0.99 seconds followed by
> a MAB of 0.99 seconds, is this still a valid signal?

No, it is not a valid signal. Per section 9.2 of E1.11 (DMX512A), a receiver is considered to have "lost data input" if it does not see the falling edge of a break within 1.25 seconds of the previous break's falling edge.

RDM has much tighter restrictions on the break length: while a DMX controller may send a 0.99 second break, an RDM controller's break must be less than 352 µs.

> I also find it very unclear what must happen when the
> timing of the incoming signal is not within spec.

What happens when a device loses DMX signal is entirely up to you. You can hold the last data forever, you can shut down after some timeout, or you can start strobing randomly. All that's required is that you document the behavior.

To quote section 9.2 of E1.11:
"Although this Standard does not specify loss of data handling procedures, manufacturers shall state what their Loss of Data handling procedures are.

> when the MAB is less than 8 µs. Must the
> packet then be dropped?

The packet may be dropped, but the standards don't require it. Older versions of the DMX standard (USITT DMX512/1986) allowed a 4 µs MAB, so a well-designed receiver may wish to support a shorter MAB.
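
For example, a receive-side acceptance check could look like this sketch, where honoring the old 4 µs minimum is an explicit design choice; the names and the helper function are mine, not from the standards:

[code]
/* Sketch of a MAB acceptance check. Accepting the USITT DMX512/1986
 * 4 us minimum is optional; the current standards specify 8 us. */
#include <stdint.h>
#include <stdbool.h>

#define MAB_MIN_US       8u        /* current receive minimum */
#define MAB_MIN_1986_US  4u        /* USITT DMX512/1986 allowed 4 us */
#define MAB_MAX_US       1000000u  /* Table 3-3 line 1: 1 second */

bool mab_acceptable(uint32_t mab_us, bool accept_1986_timing)
{
    uint32_t min_us = accept_1986_timing ? MAB_MIN_1986_US : MAB_MIN_US;
    return (mab_us >= min_us) && (mab_us <= MAB_MAX_US);
}
[/code]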

For RDM, E1.20 has a bit of guidance about when an RDM packet may be considered "lost", but there are no absolute rules. Any RDM device is free to drop any packet at any time for any reason, and it's up to the controller to determine how to handle these conditions. If a packet doesn't meet the timing specs, you can ignore it, or you can try to respond anyway. It's up to you.
July 6th, 2009   #3
kster
Junior Member
 
Join Date: Jun 2009
Posts: 5

Thanks for the very clear answer. This also makes the implementation a lot easier.
July 6th, 2009   #4
dangeross
Junior Member
 
Join Date: Feb 2009
Posts: 13

E1.20-2006 states in section 3.2.1 Responder Packet Timings that "RDM responders shall conform to the timing specified in Table 3-3". Table 3-3, line 1, shows the maximum Receive Break and MAB times as 1 s. Later in the same section it is stated: "Break timing has been modified from the DMX512 standard. The minimum has been lengthened to allow In-Line devices to decrease this time as they pass the data through".
I would interpret the above to mean that a responder must accept packets meeting the timing requirements of Table 3-3 line 1 even though the maximum time values are greater than the allowed Controller Packet maximums.
As for measuring the break time: if your micro supports edge-triggered GPIO interrupts on the UART Rx pin, you can enable a rising-edge interrupt as soon as the UART detects a break, then measure the time to the rising edge that ends the DMX/RDM break.
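
Sketched in C, with hypothetical HAL names (uart_*, gpio_*, timer_now_us, RX_PIN) standing in for whatever your micro provides, the idea looks like this:

[code]
/* Sketch: the UART framing error says "a break has started"; a
 * rising-edge GPIO interrupt on the same Rx pin then catches the
 * end of the break, which is the start of the MAB. All HAL calls
 * here are hypothetical placeholders. */
#include <stdint.h>
#include <stdbool.h>

#define RX_PIN 0  /* hypothetical pin id */

extern uint32_t timer_now_us(void);
extern bool     uart_framing_error(void);
extern void     uart_clear_framing_error(void);
extern void     gpio_arm_rising_irq(int pin);
extern void     gpio_disarm_rising_irq(int pin);

static volatile uint32_t fe_time_us;  /* when the framing error fired */
static volatile uint32_t break_us;    /* measured break length */

void uart_rx_isr(void)                /* UART receive/error interrupt */
{
    if (uart_framing_error()) {
        fe_time_us = timer_now_us();
        uart_clear_framing_error();
        gpio_arm_rising_irq(RX_PIN);
    }
}

void gpio_rx_rising_isr(void)         /* end of break = start of MAB */
{
    /* The framing error fires roughly one character time (~44 us at
     * 250 kbit/s) after the line first went low, so add that back in
     * as an approximate correction. */
    break_us = (timer_now_us() - fe_time_us) + 44u;
    gpio_disarm_rising_irq(RX_PIN);
}
[/code]

The 44 µs correction is only approximate; exactly when a UART raises the framing error varies between parts, so check your datasheet if you need a precise break measurement.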
July 6th, 2009   #5
ericthegeek
Task Group Member
 
Join Date: Aug 2008
Posts: 375

> I would interpret the above to mean that a responder
> must accept packets meeting the timing requirements
> of Table 3-3 line 1 even though the maximum time
> values are greater than the allowed Controller Packet
> maximums.

This is because an RDM responder also needs to work with DMX transmitters developed to the DMX512/1990 and E1.11/DMX512A standards, which allow a 1 second break. Thus, when receiving data, it must accept the long break.

A device built to the E1.20/RDM standard cannot generate a 1 second break.
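
For completeness, here's a transmit-side sketch that stays inside the RDM limits discussed above by bit-banging the Tx pin. The 220 µs break and 20 µs MAB are example values I picked, and the gpio_* and delay_us calls are hypothetical:

[code]
/* Sketch: generate an RDM-legal break/MAB pair by temporarily taking
 * the Tx pin away from the UART. Timing values are example choices;
 * all HAL calls are hypothetical placeholders. */
#include <stdint.h>

extern void gpio_take_tx_pin(void);     /* detach pin from the UART */
extern void gpio_tx_write(int level);
extern void gpio_release_tx_pin(void);  /* hand pin back to the UART */
extern void delay_us(uint32_t us);

void rdm_send_break_and_mab(void)
{
    gpio_take_tx_pin();
    gpio_tx_write(0);       /* break: drive the line low */
    delay_us(220);          /* well under the 352 us RDM maximum */
    gpio_tx_write(1);       /* MAB: line high */
    delay_us(20);
    gpio_release_tx_pin();  /* UART then sends the start code + data */
}
[/code]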