I have something similar running which is described in a nearby thread:
http://www.zoneminder.com/forums/viewtopic.php?t=5549
Server A is a database and web server. It happens to have a USB camera attached.
Server B is a remote ZM installation. It has another USB camera attached.
Users point their web browsers at Server A, where they can view images and events from both cameras.
There are outline details of how to make this work in that thread. Basically, Server B acts as a remote IP camera for Server A. You configure Server B with a basic ZM installation and a local camera (or a video capture card, in your case), then configure Server A with a remote camera pointing at
http://serverB/cgi-bin/zms?...etc. That URL serves a motion JPEG stream, which Server A captures and processes continuously (this is done in the background by the zmc process, whether or not anyone is actually looking at the images). Users viewing the cameras see streams served by Server A, from its local buffer of the images being streamed from Server B.
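For reference, a full zms URL looks something like the line below. The monitor ID, scale and frame-rate values here are only illustrative; substitute whatever matches the monitor you defined on Server B (and you may need to append user/pass parameters if you have ZM authentication enabled):

```
http://serverB/cgi-bin/zms?mode=jpeg&monitor=1&scale=100&maxfps=5
```

mode=jpeg asks zms for a continuous multipart motion-JPEG stream rather than a single snapshot, which is what Server A's remote camera configuration expects to capture.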
This approach does require that images from both cameras are recorded and analysed for motion on Server A, which means Server A's CPU and bandwidth will become a bottleneck as more remote cameras are added.
Anyway, is that close to what you want? Regards, Brian.