You want to make as few database or Excel reads as possible during your model run, since each read slows the run down. At the same time, your table is too big to import all at once.
Here is one concept you might use to strike a balance between these competing issues:
Use a queue in place of the source. The queue can act like a source by creating flowitems on demand via its OnMessage trigger.
Create a control object in your model with message trigger logic that queries your database. Use a LIMIT clause with an OFFSET to specify how many rows to read and how far from the beginning of the table to start. Pick a batch size that works for you, say 100 rows at a time. Create a global table, or a label table on the queue, to hold the current batch, and at model time 0 send a message to the control object to import the first 100 rows (a sketch of the query pattern follows below).
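To make the chunked read concrete, here is a minimal sketch of the LIMIT/OFFSET idea. It's plain Python with sqlite3 rather than FlexScript, and the table name (arrivals), column names, placeholder rows, and 100-row chunk size are all things I made up for illustration, not anything specific to your setup:

```python
import sqlite3

CHUNK_SIZE = 100  # pick whatever batch size suits your model

def read_chunk(conn, offset):
    """Read the next CHUNK_SIZE rows of the arrival schedule,
    ordered by time, starting `offset` rows into the table."""
    cur = conn.execute(
        "SELECT arrival_time, quantity FROM arrivals "
        "ORDER BY arrival_time LIMIT ? OFFSET ?",
        (CHUNK_SIZE, offset),
    )
    return cur.fetchall()  # at most CHUNK_SIZE rows

# Tiny in-memory stand-in for your real database, just so this runs:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE arrivals (arrival_time REAL, quantity INTEGER)")
conn.executemany("INSERT INTO arrivals VALUES (?, ?)",
                 [(t * 10.0, 1) for t in range(250)])  # placeholder rows

first_chunk = read_chunk(conn, offset=0)           # rows 1-100
next_chunk = read_chunk(conn, offset=CHUNK_SIZE)   # rows 101-200
```

The same pattern applies to whatever database connection you use from FlexSim: each read pulls only one batch of rows, ordered by scheduled time, and the offset advances by the batch size on the next read.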
Next, create a message loop that starts at row 1: send a delayed message timed to fire at row 1's scheduled time. When that message is received, the flowitem(s) are created (a queue can create flowitems in response to a message), and a new delayed message is sent to fire at row 2's scheduled time. When you reach row 100, read the next 100 rows from the database and start over (a sketch of this loop follows below).
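Here is a rough sketch of that self-chaining message loop, again in plain Python rather than FlexScript, just to show the shape of the logic. The three callables are stand-ins I'm assuming for illustration: fetch_chunk for the LIMIT/OFFSET query above, schedule_at for a delayed message (senddelayedmessage in FlexSim), and create_items for the queue's OnMessage flowitem creation:

```python
class ArrivalController:
    """Sketch of the control object's message loop. It holds one chunk of
    the schedule at a time and walks it row by row, chaining one delayed
    message per row."""

    def __init__(self, fetch_chunk, schedule_at, create_items):
        self.fetch_chunk = fetch_chunk    # offset -> list of rows
        self.schedule_at = schedule_at    # (delay, callback) -> None; callback gets current time
        self.create_items = create_items  # row -> None (queue makes the flowitem)
        self.offset = 0
        self.chunk = []
        self.row = 0

    def start(self, now=0.0):
        """Model time 0: import the first chunk, then schedule row 1."""
        self._load_chunk()
        self._schedule_next(now)

    def _load_chunk(self):
        self.chunk = self.fetch_chunk(self.offset)  # e.g. rows 1-100
        self.offset += len(self.chunk)
        self.row = 0

    def _schedule_next(self, now):
        if self.row >= len(self.chunk):
            self._load_chunk()                      # read the next 100 rows
            if not self.chunk:
                return                              # schedule exhausted; stop
        arrival_time = self.chunk[self.row][0]      # assumes column 0 is the time
        self.schedule_at(arrival_time - now, self.on_message)

    def on_message(self, now):
        """Fires at the current row's scheduled time (the delayed message)."""
        self.create_items(self.chunk[self.row])     # queue creates the flowitem(s)
        self.row += 1
        self._schedule_next(now)                    # chain the next delayed message
```

The design point is that the controller never holds more than one chunk in memory and only touches the database when the current chunk runs out, so you pay for one database read per 100 arrivals instead of one per arrival (or one giant read up front).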
This is all somewhat abstract without an example model. I'll see if I can't cook something up soon and update this answer. Someone else may have a nice way of doing it in Process Flow, so we'll see if any other suggestions come in as well.