Problems with xBee

Problems with xBee

Postby JonSenior » Tue Sep 25, 2007 7:52 pm

OK. I've been reading through the spec sheets ('cause I'm that sort of exciting guy!) and unless someone with more experience with them can correct me, I don't think the xBees are going to be any use. The problem is taking control of the data transmission. The devices are designed to work as part of a clever multipoint data network. Our requirement is that the delay from the point when we get a trigger signal from the camera to the transmission of our data packet be as small as possible.

To force transmission involves sending a series of three signals to the chip via the UART (at a maximum speed of 115 kbps). Each of these signals consists of the ASCII string "ATxx", where xx is either GT or CC. This assumes (falsely, it appears) that we don't have to provide any parameters. Given the maximum UART speed, we can only send 14 characters before we exceed a delay of 1/1000 s. And that is before the data has been processed on board the xBee, transmitted, received and decoded.
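As a sanity check on the figure above, a quick back-of-the-envelope calculation (the 14-characters-per-millisecond figure corresponds to raw 8-bit bytes; standard 8N1 UART framing adds a start and a stop bit per byte, which makes the realistic number slightly worse):

```python
# Back-of-envelope UART timing at the XBee's maximum of 115200 bps.
# Assumption: standard 8N1 framing = 10 bits on the wire per byte
# (1 start + 8 data + 1 stop); the raw 8-bit figure is shown for comparison.

BAUD = 115200  # max UART speed, per the post

def chars_per_ms(bits_per_char):
    """How many characters fit into 1 ms at the given framing."""
    return (BAUD / bits_per_char) / 1000.0

raw = chars_per_ms(8)      # no framing overhead
framed = chars_per_ms(10)  # 8N1 framing

print(f"raw 8-bit:  {raw:.1f} chars/ms")    # matches the post's ~14
print(f"8N1 framed: {framed:.2f} chars/ms") # the more realistic figure
```

So even before the radio does anything at all, a three-command "ATxx" exchange eats most of a millisecond on the serial link alone.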

We might be able to get around this by queuing up the majority of the data first and saving the last byte until we get the signal. Alternatively (or in parallel), there is an option to use the device to send data about the status of various digital and analogue inputs. If this can be sent faster, it would eliminate the processing at the other end, as the receiver will repeat the status at its matching pin, and this could be used to trigger directly. I've yet to see if those pins will act as interrupts which force immediate data transmission.
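A minimal sketch of the pre-staging idea, with `uart_write` as a hypothetical stand-in for the MCU's serial transmit routine (it is not a real XBee or library call):

```python
# Sketch: push all but the last byte into the radio's UART buffer ahead of
# time, then emit only one byte on the camera trigger.
# `uart_write` and PACKET are illustrative stand-ins, not a real XBee API.

PACKET = b"\x7e\x00\x05FIRE!"  # illustrative payload

staged = PACKET[:-1]        # everything except the final byte
trigger_byte = PACKET[-1:]  # the single byte held back

sent = bytearray()

def uart_write(data: bytes):
    """Stand-in for the MCU's UART transmit routine."""
    sent.extend(data)

# During idle time, well before the camera trigger:
uart_write(staged)

# On the camera trigger, only one byte (~87 us at 115200 bps, 8N1)
# still has to cross the UART before the radio can act:
uart_write(trigger_byte)

assert bytes(sent) == PACKET
```

The catch, as discussed below, is recovering from this state if something other than a trigger (power information, say) has to be sent next.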

Do any of the devs with experience using these devices have any idea of how to get the damn things to be a little less intelligent?
JonSenior
 
Posts: 60
Joined: Fri Sep 21, 2007 7:27 pm
Location: Paris, France

Wireless networks

Postby Rudeofus » Tue Sep 25, 2007 9:16 pm

If these XBees are meant to be used in any kind of wireless network, you can safely assume that there are methods implemented for channel arbitration, inclusion of new network partners, and so on. You can also safely assume that there will be delays and latencies in the data transmission against which the RS232 latency pales. For Bluetooth this is definitely true, and I'd be very surprised if ZigBee were any different/better (how would it be?).

Don't think just of the latency for getting one data set transmitted. Consider what happens if that data packet gets lost! Do these systems detect the data loss? Do they attempt retransmits some time later? When? Do they lose synchronization within the network? How long do they need to reestablish communication? Even the Cactus triggers automatically transmit the signal several times to improve the odds of a successful trigger ...
Rudeofus
 
Posts: 27
Joined: Sun Sep 23, 2007 11:28 pm

Postby Thonord » Tue Sep 25, 2007 11:16 pm

Nordic told me about a company, Sparkfun, that makes modules based upon Nordic's RF chip and an 8051-compatible MCU, with varying degrees of "RF networking capability" coded into the 8051.
While this may be interesting, the more commercial code in the 8051, the more difficult it may be to keep this open source.

Tom
Ppl who agree need normally not reply, those who disagree or have questions do.
Or - just ignore me.
Thonord
 
Posts: 50
Joined: Fri Sep 21, 2007 2:32 pm
Location: Norway

Re: Wireless networks

Postby JonSenior » Wed Sep 26, 2007 5:21 am

Rudeofus wrote:If these XBees are meant to be used in any kind of wireless network, you can safely assume that there are methods implemented for channel arbitration, inclusion of new network partners, and so on. You can also safely assume that there will be delays and latencies in the data transmission against which the RS232 latency pales. For Bluetooth this is definitely true, and I'd be very surprised if ZigBee were any different/better (how would it be?).


This is exactly what they do. They support retries if they don't receive an ACK, and checksums to verify data integrity. However, they can be given a broadcast address, in which case they just transmit. This would be useful to us as it eliminates much of the latency inherent in having a network device.

Even the cactus triggers automatically transmit the signal several times to improve the odds of successful triggers ...


Given the way they work, I think this is more an artifact of the design, than designed functionality! ;-)

Thonord wrote:Nordic told me about a company, Sparkfun, that makes modules based upon Nordic's RF chip and an 8051-compatible MCU, with varying degrees of "RF networking capability" coded into the 8051.


Prices on the Hardware page of the wiki for Nordic modules are taken from Sparkfun. At the "in-between" level they package the Nordic chips with aerials or aerial mounts, all necessary passives, and a radio shield for direct use with an MCU. This is currently my preferred option. It's low-level enough that it bypasses the network organisation layer of the ZigBees, but high-level enough that it can be easily interfaced to an MCU.

Jon
JonSenior
 
Posts: 60
Joined: Fri Sep 21, 2007 7:27 pm
Location: Paris, France

Re: Wireless networks

Postby Rudeofus » Wed Sep 26, 2007 1:41 pm

JonSenior wrote:This is exactly what they do. They support retries if they don't receive an ACK, and checksums to verify data integrity. However, they can be given a broadcast address, in which case they just transmit. This would be useful to us as it eliminates much of the latency inherent in having a network device.

We still have latency, because somehow the network has to ensure that no more than one peer transmits at a given time. Bluetooth solves this with allotted time slots -> latency until your time slot comes around, even if you broadcast and don't wait for ACKs.

JonSenior wrote:
Even the cactus triggers automatically transmit the signal several times to improve the odds of successful triggers ...


Given the way they work, I think this is more an artifact of the design, than designed functionality! ;-)


It's a smart move on their side, since they got slammed badly for >5% misfires. AFAIK they have a second version out now with somewhat improved trigger probability (~1% misfires). I assume repeatedly sending the trigger signal did the trick; at least that's the way I'd do it with their limited hardware ...

Prices on the Hardware page of the wiki for Nordic modules are taken from Sparkfun. At the "in-between" level they package the Nordic chips with aerials or aerial mounts, all necessary passives and a radio shield for direct use with an MCU. This is currently my preferred option. It's low-level enough that it bypasses the network organisation layer of the Zigbees but high enough that it can be easily interfaced to an MCU.

The question remains whether these Nordic chips fare much better than the Cactus cheapies:
Do they operate at higher power levels? No
Do they have a highly sensitive receiver? No
Do they include highly efficient antennas? No
Do they use a better suited modulation scheme? Yes
Can they do frequency hopping? No
Can they do spread spectrum? No
Can they drive multiple antennas? No
Do they have/can they use great IF filters? No

Hmmmmmmm.....

The only thing we'd gain from using the Nordic is better behavior in the presence of weak in-channel interference and the ability to transmit commands instead of simple triggers ...
Rudeofus
 
Posts: 27
Joined: Sun Sep 23, 2007 11:28 pm

Re: Wireless networks

Postby JonSenior » Wed Sep 26, 2007 5:16 pm

Rudeofus wrote:The question remains whether these Nordic chips fare much better than the Cactus cheapies:
Do they operate at higher power levels? No

That's a legal issue; no way around that one. And we're comparing apples and oranges, as the Cacti use 433 MHz and the Nordics use 2.4 GHz.
Do they have a highly sensitive receiver? No
Do they include highly efficient antennas? No

One of the modules I detail on the Hardware page provides a standard 50 Ohm aerial connection. A decent whip aerial is about $7 from the same supplier.
Do they use a better suited modulation scheme? Yes
Can they do frequency hopping? No

Yes, but we have to code it. The device offers a total of 125 frequencies and is designed for simultaneous reception on two channels with 8 MHz separation.
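If we did code the hopping ourselves, the usual approach is a pseudo-random hop sequence derived from a shared seed, so both ends land on the same channel at each hop. A minimal sketch; the 125-channel figure comes from the datasheet discussion above, everything else (seed, sequence length) is illustrative:

```python
import random

N_CHANNELS = 125  # channel count quoted for the Nordic chip above

def hop_sequence(shared_seed: int, length: int):
    """Deterministic channel list that both transmitter and receiver
    can derive from a shared seed, so they hop in lockstep."""
    rng = random.Random(shared_seed)
    return [rng.randrange(N_CHANNELS) for _ in range(length)]

# Both ends compute the same sequence from the same seed:
tx_hops = hop_sequence(0xC0FFEE, 16)
rx_hops = hop_sequence(0xC0FFEE, 16)

assert tx_hops == rx_hops                           # every hop agrees
assert all(0 <= ch < N_CHANNELS for ch in tx_hops)  # all channels valid
```

Keeping the two ends time-synchronized so they hop at the same instant is the hard part; the sequence itself is cheap.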
Can they do spread spectrum? No
Can they drive multiple antennas? No

Presumably with the aid of an external radio amplifier they can. Do we need multiple aerials?
Do they have/can they use great IF filters? No

Why can't they use them?

Hmmmmmmm.....

The only thing we'd gain from using the Nordic is better behavior in the presence of weak in-channel interference and the ability to transmit commands instead of simple triggers ...


See above for the reasons why I disagree, although I freely admit to not being a radio engineer.

It strikes me that the same facts are true of all the radio "modules", and I presume your proposal is that we build our own transmitter. I would lean away from this for simple reasons of complexity. Unless using off-the-shelf modules really cripples the capabilities of the product (reduced maximum sync speed, for example), I think the added complexity and debugging nightmares of a custom-designed RF unit, plus the added processing load on the MCU, don't make for a viable project. At least... not for a home build.

The next area on my list to investigate is the "traditional" SIL radio modules. These don't appear to answer a great many of the points on your list, but they do allow us to provide our own Just Good Enough (tm) encoding scheme.

AFAIK the V2 Cacti solved the problem of random triggering with Canon flashes, but the reliability remains pretty much where it was. If I decide not to send my faulty V2s back to China but to keep them for investigation / disassembly, then I'll be able to compare them to the V1 16-channels that are (supposedly) on their way!

Jon
JonSenior
 
Posts: 60
Joined: Fri Sep 21, 2007 7:27 pm
Location: Paris, France

Postby JonSenior » Wed Sep 26, 2007 5:19 pm

Rudeofus: Looking at your résumé in the Welcome post, it is clear that your qualifications in this area exceed mine by several orders of magnitude. In the quite likely event that I missed it, what would your proposal be for the RF side of the project?
JonSenior
 
Posts: 60
Joined: Fri Sep 21, 2007 7:27 pm
Location: Paris, France

Postby brittonphotography » Thu Sep 27, 2007 12:15 am

Jon, you mentioned ZigBee in your posts...

While talking to the XBee rep, we discussed ZigBee and decided that it would indeed have a long latency time for tx/rx.

The XBee with point-to-point and point-to-multipoint transmission is better than the ZigBee mesh...

Not my area of expertise,
but the rep seemed to think the XBee could handle the latency problem...

Of course, he is a sales rep, so he is trying to sell his product.

Is the XBee really not adequate?
brittonphotography
 
Posts: 25
Joined: Fri Sep 21, 2007 8:52 pm

Postby JonSenior » Thu Sep 27, 2007 12:31 am

The problem I identified was not transmission latency, but the speed at which we can actually force the chip to send. If we can indeed send all but the last byte in advance and only send that byte when we trigger, it might be achievable. The problem would then be recovering from that point if we need to do something else (like send power information!)

Jon
JonSenior
 
Posts: 60
Joined: Fri Sep 21, 2007 7:27 pm
Location: Paris, France

Postby brittonphotography » Thu Sep 27, 2007 1:06 am

Is power information sent directly with the flash trigger in E/I-TTL modes, or is that pre-loaded onto the flash as well?
That could solve the problem.

If we are doing a quench signal for manual power, would this work:
Send all but the last bit of the fire signal... send the fire bit.
Send all but the last bit of the quench signal... send the quench bit.
What is the time lapse between the fire bit and the quench bit?


I'm seeing ~1/833 s fire time as a good estimate for full power (for Canon),
and ~1/19,000 s for 1/32 power.

http://www.joesmalley.com/flashes/ (other flashes slower)

250,000 bps transmit rate on the XBee.

8 bits to transmit the quench signal:
8 bits / 250,000 bps = 1/31,250 s

which is quicker than the 1/19,000 s for 1/32 power on a Canon flash,

but just at the 1/31,000 s for 1/64 power on a Canon flash, so we may not be able to get that far with the send-fire, send-quench routine.



Wouldn't another option be to have the 7 bits of the signal before the 1-bit fire signal carry the data necessary to tell the receiver to emit the quench signal after x time, without needing another signal from the transmitter?
It seems like we would have enough data in those 7 bits to set the quench speed.
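The arithmetic above can be sanity-checked directly (all figures are the ones quoted in this post; real over-the-air framing such as preamble, addressing, and checksum would add overhead to the bare 8 bits):

```python
# Sanity check of the quench-timing arithmetic, using the post's figures.
# Assumption: a bare 8-bit quench frame with no framing overhead.

OTA_BPS = 250_000            # XBee over-the-air rate, per the post
QUENCH_BITS = 8              # minimal quench frame assumed above

quench_time = QUENCH_BITS / OTA_BPS   # = 1/31,250 s
t_1_32_power = 1 / 19_000             # Canon flash duration at 1/32 power
t_1_64_power = 1 / 31_000             # Canon flash duration at 1/64 power

assert quench_time == 1 / 31_250      # matches the post's figure
assert quench_time < t_1_32_power     # comfortably quick enough at 1/32
margin = t_1_64_power - quench_time
assert 0 < margin < 1e-6              # under a microsecond to spare at 1/64
```

So a transmitted quench comfortably beats the 1/32-power duration but has less than a microsecond of slack at 1/64 power, which is why pre-arming the receiver with a delay value looks attractive.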
brittonphotography
 
Posts: 25
Joined: Fri Sep 21, 2007 8:52 pm

Postby JonSenior » Thu Sep 27, 2007 8:24 am

brittonphotography wrote:Is power information sent directly with the flash trigger in E/I-TTL modes, or is that pre-loaded onto the flash as well?
That could solve the problem.

If we are doing a quench signal for manual power, would this work:
Send all but the last bit of the fire signal... send the fire bit.
Send all but the last bit of the quench signal... send the quench bit.
What is the time lapse between the fire bit and the quench bit?



OK. There are two or three separate things here.

We have original TTL using a quench signal.
We have E-TTL using timing information sent to the flash.
We have E-TTL over wireless IR.

We can hijack these to allow manual remote power control of the connected flash. Doing this requires understanding the protocols involved for the different flashes, or at the least, the correct pin-outs for the different hot-shoes.

Alternatively, we can capture the IR emitted by the camera controlling the flashes and retransmit it as RF, before converting it back to IR at the flash. This would allow the photographer to retain fully automatic flash control, and is the method used by the RadioPopper device.

250,000 bps transmit rate on the XBee.

8 bits to transmit the quench signal:
8 bits / 250,000 bps = 1/31,250 s


The problem is not the data transmission rate, but the rate at which we can get data into the buffer. The interface for doing that runs at only 115 kbps.

My personal goal would be to provide the first two methods above: namely the use of a quench signal generated by the receiver to control flash power (this would probably require the receiver to have at least some knowledge of the characteristics of the flash it's connected to, above and beyond the physical connection), and, in a more sophisticated manner, the use of the data link present on modern flashes to set the flash power.

The latter is "cooler" from a geek point of view, and seemingly not insurmountable. The former provides support for the older generation of TTL flashes (more "Strobist"!) and may even add new functionality to certain older models.

Jon
JonSenior
 
Posts: 60
Joined: Fri Sep 21, 2007 7:27 pm
Location: Paris, France

Postby Rudeofus » Thu Sep 27, 2007 9:09 pm

JonSenior wrote:The problem is not the data transmission rate, but the rate at which we can get data into the buffer. The interface for doing that runs at only 115 kbps.

If we really get 115,200 bps, we'd be set. However, we won't get the full data rate if the RF conditions are nasty. And they WILL be nasty, really nasty, in urban areas. Why do you think the Cactuses produce so many misfires?

First: you can get 115 kbps by transmitting at 115 Mbps for one ms every second, which gives you a decent data rate but extreme latencies. ZigBee and Bluetooth won't be THAT bad, but there will be delays, because these are network protocols, meant to facilitate access for multiple clients. What we actually need is a modem set which assumes exactly one transmitter and multiple receivers: no channel access arbitration of any kind, all transmissions broadcast to all receivers concurrently.

Second: how are misfires handled? Would we detect that a frame was missed? Do we retransmit every frame multiple times? How long do we wait before we assume a frame has been lost and retransmit it? How do we make sure that a retransmitted frame doesn't meet exactly the same nasty RF conditions? How long will it take to frequency-hop? Or to switch antenna properties?

This is where the data rate will go down the drain (unless a proper protocol is used), not the lame speed of RS232.
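The "transmit several times" trick mentioned for the Cactus triggers can be quantified, under the (optimistic) assumption that frame losses are independent rather than bursty, which is exactly the assumption nasty RF conditions break:

```python
# How blind retransmission improves trigger odds, assuming each frame
# is lost independently with probability p_loss. Bursty interference
# violates this assumption, which is what hopping tries to restore.

def miss_probability(p_loss: float, repeats: int) -> float:
    """Chance that ALL `repeats` copies of a frame are lost."""
    return p_loss ** repeats

# With the >5% single-frame loss quoted for the V1 Cactus above:
assert abs(miss_probability(0.05, 1) - 0.05) < 1e-12    # one send: 5%
assert abs(miss_probability(0.05, 2) - 0.0025) < 1e-12  # two sends: 0.25%
assert miss_probability(0.05, 3) < 1e-3                 # three: ~0.01%
```

The exponent only helps if the repeats see independent channel conditions, hence the questions above about hopping between retransmissions.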
Rudeofus
 
Posts: 27
Joined: Sun Sep 23, 2007 11:28 pm

We can't use XBee :-(

Postby Rudeofus » Sat Sep 29, 2007 11:55 pm

It appears that PocketWizard has a patent on radio links attached to photographic equipment where microprocessors are used on both ends of the link. This pretty much kills all suggestions of smart RF modules, since all of them have CPUs inside in order to be smart :-(

So either we find really dumb :-P receiver modules without CPU circuitry inside, or we have to hack up something ourselves ...
Rudeofus
 
Posts: 27
Joined: Sun Sep 23, 2007 11:28 pm

Postby brittonphotography » Sun Sep 30, 2007 9:28 pm

From http://en.wikipedia.org/wiki/Microprocessor

"There is no universal consensus on what defines a "microprocessor", but it is usually safe to assume that the term refers to a general-purpose CPU of some sort and not a special-purpose processor unless specifically noted."

Is an IC or PIC chip a microprocessor?
Or, by using those, do we bypass the PocketWizard patent?
brittonphotography
 
Posts: 25
Joined: Fri Sep 21, 2007 8:52 pm

Postby Rudeofus » Sun Sep 30, 2007 10:29 pm

brittonphotography wrote:From http://en.wikipedia.org/wiki/Microprocessor

"There is no universal consensus on what defines a "microprocessor", but it is usually safe to assume that the term refers to a general-purpose CPU of some sort and not a special-purpose processor unless specifically noted."

Is an IC or PIC chip a microprocessor?
Or, by using those, do we bypass the PocketWizard patent?

Since we can safely assume that PocketWizard did not plan on using the 80486 for their PocketWizards back then :-P, we can assume that their patent covers microcontrollers too. If their patent lawyers thought the patent included microcontrollers, I assume most legal folks (including judges) will too. Note that any PIC or AVR you can buy has more processing power, memory, registers and a larger instruction set than the first microprocessor. If using microcontrollers did not infringe on the patent, 10^6 versions of the PocketWizard would exist by now and this forum would have no reason to exist ...

But we don't need a CPU/MCU in our receiving circuit. An FPGA can do the same (albeit with more effort) without replicating microcontroller behavior. The idea would be to build and discuss prototypes based on microcontrollers, but if we want to build a final design and share it with others (or sell it), an FPGA-based implementation would replace the microcontroller. Even if we use a state machine, the state machine wouldn't be generic enough to qualify as a CPU or MCU of any kind.

Apart from FPGAs, we could try to use special-purpose processors like signal processors (the AD2181 doesn't even have a push/pop instruction; how much more special-purpose can it get?), some of which are even supported by the GNU toolchain.

PS: Since the pocketwizard patent so specifically mentions that their invention employs a CPU, I assume there was prior art for triggering a flash via a simple RF link.
Rudeofus
 
Posts: 27
Joined: Sun Sep 23, 2007 11:28 pm
