Abstract
In this paper, we study the problem of scheduling jobs from two different queues on several parallel servers. Jobs have exponentially distributed processing times and incur holding costs per unit of time until they leave the system; no new jobs arrive at any time. The objective is to find the optimal strategy for allocating the servers to the queues, such that the expected holding costs are minimized. We give a sufficient condition under which it is always optimal to allocate the servers only to jobs of a certain queue. Finally, the case of two servers is solved completely.
| Original language | English |
| --- | --- |
| Pages (from-to) | 127-148 |
| Number of pages | 22 |
| Journal | Mathematical Methods of Operations Research |
| Volume | 66 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Aug 2007 |
Keywords
- Optimal control
- Parallel servers
- Scheduling problem