We consider an approach for merging message streams from producers distributed over a network into a single stream. In this approach, the messages are merged by a deterministic algorithm, so if the streams are sent to multiple "mergers", each executing the same deterministic merge, then all mergers deliver the messages in a consistent total order. The technique is therefore an efficient solution to atomic broadcast and global atomic multicast. We present a specific deterministic merge algorithm, the Bias Algorithm, which exploits knowledge of the expected message rates of the producers. We show that the Bias Algorithm has the least total expected delay when messages are produced according to a binomial process with known rate parameters. Our motivating application is middleware for a widely distributed publish-subscribe system in which subscribers may be intermittently connected and require reliable, ordered delivery. Messages are logged to stable storage at "logger" nodes, which serve as the sources of the message streams to be merged. As a result, (1) no second stable log is needed to record the merge order, and (2) the merge function can be replicated at independent sites, with each site guaranteed to merge the messages in the same order. We validate our work using both dynamic programming theory and simulations.
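To make the central idea concrete, here is a minimal sketch of a rate-biased deterministic merge. It is not the paper's Bias Algorithm; the function name, the credit-counter scheme, and the tie-breaking rule are illustrative assumptions. The point it demonstrates is the one the abstract relies on: because the choice of which producer to consume from next depends only on the known rates and a deterministic rule, any merger given the same input streams produces the same total order.

```python
from typing import Dict, List

def deterministic_merge(streams: Dict[str, List[str]],
                        rates: Dict[str, float]) -> List[str]:
    """Merge per-producer message streams into one total order using a
    deterministic, rate-biased schedule (a sketch, not the paper's exact
    Bias Algorithm).

    Each producer accumulates "credit" at its expected rate; at every step
    the merger consumes the next message from the producer with the most
    credit among those that still have messages, breaking ties by producer
    id so the choice is fully deterministic.
    """
    credit = {p: 0.0 for p in streams}   # expected minus consumed messages
    taken = {p: 0 for p in streams}      # messages consumed so far
    total = sum(len(s) for s in streams.values())
    out: List[str] = []
    for _ in range(total):
        # advance every producer's credit by its expected rate
        for p, r in rates.items():
            credit[p] += r
        # pick the producer with the most credit; tie-break on producer id
        ready = [p for p in streams if taken[p] < len(streams[p])]
        nxt = max(ready, key=lambda p: (credit[p], p))
        out.append(streams[nxt][taken[nxt]])
        taken[nxt] += 1
        credit[nxt] -= 1.0               # one expected message was consumed
    return out
```

Because the schedule is a pure function of the rates and the stream contents, independently replicated mergers need no coordination to agree on the order, which is what lets the logger nodes serve as the only stable log.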