Very new to all this, so apologies if this is a dumb question... and if the answer is RTFM, I promise I've been reading a lot!
I'm trying to get my head around what protocols are being used by this plugin and how audio is being handled.
With the Openfire/Candy plugin, which protocol is used to deliver audio from the different clients?
If I connect using a browser with Flash and a PC mic, am I restricted to RTMP for audio?
What does it mean, then, to configure the browser component with SIP?
If another user connects with a SIP phone, are they using RTP for audio?
What are the respective purposes of the RTMP server, the Red5 voicebridge server, and the jVoiceBridge server?
Is the org.red5.server.webapp.voicebridge.Application server component (or something else) converting the RTMP audio stream into RTP for use by the voicebridge, and then back into RTMP for delivery to Flash-based clients?
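To make that question concrete, here is the audio path as I currently imagine it. This is purely my guess at the architecture, not something I've confirmed from the docs, so please correct any part of it:

```
Flash/browser client --RTMP--> Red5 (org.red5.server.webapp.voicebridge.Application)
                                       |  (transcode/repacketize? RTMP <-> RTP)
                                       v
                               jVoiceBridge (conference mixer, RTP)
                                       |
                                       v
SIP phone <--RTP-- (signaling via SIP to the bridge)
```

Is that roughly right, or does the audio take a different route?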
Sorry, I'm pretty confused by all this, as you can probably tell...