Suppose customers join a queue with a Poisson arrival rate. If a customer is not served within one unit of time, she abandons the queue. Customers are served by two servers: one runs the first-come-first-served (FCFS) policy and the other runs the last-come-first-served (LCFS) policy. The FCFS server's service times are i.i.d. exponential with mean $\lambda m$, where $\lambda<1$; the LCFS server's service times are i.i.d. exponential with mean $\delta m$, where $\delta<1-\lambda$. A customer departs the queue after being served by either server. I would like to show that the average length of the queue is at least $(1-\delta)m - o(m)$. Any input would be appreciated!
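In case it helps anyone experiment, here is a minimal discrete-event simulation sketch of the system. The problem statement above leaves the arrival rate unspecified; the code assumes a (hypothetical) rate of $m$, which is the scale that would make the conjectured $(1-\delta)m$ bound plausible, so treat it only as a sanity-check tool, not as a statement of the model.

```python
import heapq
import itertools
import random

def simulate_queue(m, lam, delta, horizon, seed=0):
    """Discrete-event simulation of the two-server queue with abandonment.

    ASSUMPTION: arrivals form a Poisson process of rate m (the problem
    leaves the rate unspecified; rate m is a guess that makes the
    conjectured (1 - delta) * m scale plausible).  A waiting customer
    abandons exactly one unit of time after arrival.  Returns the
    time-average number of waiting (not yet in service) customers.
    """
    rng = random.Random(seed)
    seq = itertools.count()          # tie-breaker for the event heap
    events = []                      # entries: (time, seq, kind, arrival_time)
    heapq.heappush(events, (rng.expovariate(m), next(seq), 'arrival', None))
    waiting = []                     # arrival times of queued customers, oldest first
    fcfs_free = lcfs_free = True
    area = 0.0                       # integral of queue length over time
    last_t = 0.0

    def start_service(now):
        nonlocal fcfs_free, lcfs_free
        while waiting and (fcfs_free or lcfs_free):
            if fcfs_free:            # FCFS server takes the oldest waiting customer
                waiting.pop(0)
                fcfs_free = False
                heapq.heappush(events, (now + rng.expovariate(1.0 / (lam * m)),
                                        next(seq), 'fcfs_done', None))
            else:                    # LCFS server takes the newest waiting customer
                waiting.pop()
                lcfs_free = False
                heapq.heappush(events, (now + rng.expovariate(1.0 / (delta * m)),
                                        next(seq), 'lcfs_done', None))

    while events:
        t, _, kind, tag = heapq.heappop(events)
        if t > horizon:
            break
        area += len(waiting) * (t - last_t)
        last_t = t
        if kind == 'arrival':
            waiting.append(t)
            heapq.heappush(events, (t + rng.expovariate(m), next(seq), 'arrival', None))
            heapq.heappush(events, (t + 1.0, next(seq), 'abandon', t))
        elif kind == 'abandon':
            if tag in waiting:       # still waiting one unit after arrival: leave
                waiting.remove(tag)
        elif kind == 'fcfs_done':
            fcfs_free = True
        else:                        # 'lcfs_done'
            lcfs_free = True
        start_service(t)

    return area / last_t if last_t > 0 else 0.0
```

For instance, `simulate_queue(m=50, lam=0.4, delta=0.2, horizon=200.0)` can be compared against the conjectured lower bound $(1-0.2)\cdot 50 = 40$ (requires $\delta>0$, since the LCFS service rate is $1/(\delta m)$).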
P.S. The special case of this problem with $\delta=0$ has been solved here: Average queue length with impatient customers. While the intuition seems similar, I have not been able to adapt that approach.