We study the problem of assigning $K$ identical servers to a set of $N$ parallel queues in a time-slotted queueing system. The connectivity of each queue to each server varies randomly over time; during each time slot, each server can serve at most one queue and each queue can be served by at most one server. Such queueing models have been used to address resource allocation problems in wireless networks. It has previously been proven that Maximum Weighted Matching (MWM) is a throughput-optimal server assignment policy for such a queueing system. In this paper, we prove that for a system with i.i.d. Bernoulli packet arrivals and connectivities, MWM minimizes, in the stochastic ordering sense, a broad range of cost functions of the queue lengths, such as total queue occupancy (which implies minimization of average queueing delay). We then extend the model to imperfect service, in which the service of a scheduled packet fails randomly with a certain probability, and prove that the same policy remains optimal for the extended model. Finally, we show that the results remain valid for more general connectivity and arrival processes that follow conditional permutation invariant distributions.
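To make the scheduling model concrete, the following is a minimal simulation sketch of an MWM-style server assignment in this setting: in each slot the policy picks the matching of servers to connected queues that maximizes the total matched queue length. The function names, the brute-force matching (reasonable only for small $K \le N$), and the parameter values are illustrative assumptions, not taken from the paper.

```python
import itertools
import random

def mwm_assignment(queue_lens, conn):
    """Return a server->queue matching maximizing total matched queue length.

    conn[s][q] is 1 if server s is connected to queue q in this slot.
    Brute force over injective assignments; assumes K <= N and both small.
    """
    K, N = len(conn), len(queue_lens)
    best_weight, best = -1, {}
    # Each candidate assigns server s to queue perm[s] (all queues distinct).
    for perm in itertools.permutations(range(N), K):
        weight = sum(queue_lens[q] * conn[s][q] for s, q in enumerate(perm))
        if weight > best_weight:
            best_weight = weight
            # Keep only pairs whose link is actually connected.
            best = {s: q for s, q in enumerate(perm) if conn[s][q]}
    return best

def simulate(N=4, K=2, p=0.7, q=0.3, T=1000, seed=0):
    """Run T slots with i.i.d. Bernoulli(q) arrivals and Bernoulli(p) links."""
    rng = random.Random(seed)
    lens = [0] * N
    for _ in range(T):
        conn = [[int(rng.random() < p) for _ in range(N)] for _ in range(K)]
        for s, qi in mwm_assignment(lens, conn).items():
            if lens[qi] > 0:
                lens[qi] -= 1                      # one packet served per matched server
        for qi in range(N):
            lens[qi] += int(rng.random() < q)      # Bernoulli arrival
    return lens
```

Under these illustrative rates ($Nq = 1.2$ arrivals per slot against $K = 2$ servers) the system is stable and the queue lengths stay small, consistent with MWM's throughput optimality; the imperfect-service extension could be sketched by discarding each scheduled service with some failure probability.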