That is possible, though I would use an instanced process flow to model the functionality and use a queue as a stand-in for the combiner. This ends up being easier than trying to work around the requirements of the combiner, in my opinion.
The main parts (pallets) are still just sent to the combiner queue (cq) as before. The maximum content of the cq is set to 1. The other queues don't send their items directly to an object; instead, they push them to a global list. The output connections are used to reference the cq as the list partition. That way the cq can later only pull from the queues connected to it.
The list is set up with a 'name' field that simply returns the name of the value pushed to the list (the item).

The process flow reacts to a pallet entering the cq. It then pulls the item with the matching name from the global list, moves it into the pallet and waits for a predetermined process time. Afterwards the output of the cq is opened and the pallet can leave.
The output is closed again in the 'On Exit' trigger of the cq (in addition to closing it 'On Reset', so it's closed when the model starts).
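To make the matching step concrete, here is a minimal Python sketch of the same logic. It is purely illustrative: the classes (`GlobalList`, `CombinerQueue`, etc.) are hypothetical stand-ins for the FlexSim objects and activities, not FlexScript API, and the component names are made up.

```python
from collections import defaultdict

# Illustrative sketch only: hypothetical stand-ins for the FlexSim
# objects. It mimics the global list with a per-cq partition, the
# pull-by-name step, and the open/close of the cq's output.

class Item:
    def __init__(self, name):
        self.name = name

class GlobalList:
    def __init__(self):
        # partition key (the combiner queue) -> items pushed by its queues
        self._partitions = defaultdict(list)

    def push(self, item, partition):
        self._partitions[partition].append(item)

    def pull(self, name, partition):
        # Pull the first item whose 'name' field matches; None if absent.
        items = self._partitions[partition]
        for i, item in enumerate(items):
            if item.name == name:
                return items.pop(i)
        return None

class CombinerQueue:
    def __init__(self):
        self.output_open = False  # closed 'On Reset'

def on_pallet_entry(cq, pallet_contents, required_names, global_list):
    # Pull each required component from this cq's partition into the pallet.
    for name in required_names:
        item = global_list.pull(name, partition=cq)
        if item is None:
            raise RuntimeError(f"component '{name}' is not on the list yet")
        pallet_contents.append(item)
    # ... a Delay activity for the process time would sit here ...
    cq.output_open = True  # open the output so the pallet can leave

def on_pallet_exit(cq):
    cq.output_open = False  # 'On Exit' trigger closes the output again
```

Because the partition key is the cq itself, several combiner queues can share one list and one process flow instance each without pulling each other's components.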

You can link as many of these combiner queues to the process flow as you want. They will all run the same logic.

custom_combiner_fm.fsm