**Minimize maximum lateness**

Since we are on a greedy streak with problems like interval partitioning and the coin change problem, let’s continue with it for at least one more problem. Today’s problem is to **minimize maximum lateness** of a set of jobs. Let me state the problem in detail: we have a processor which can process one job at a time and, as always, a list of jobs to be scheduled on it, with the goal that the maximum lateness of any job should be minimized. Contrary to previous problems, this time we are not given start and end times; instead, each job comes with the length of time (ti) it will run and the deadline (di) it has to meet. fi is the actual finish time of the job.

Lateness of a job is defined as

l(i) = max{0, fi − di}, i.e. the length of time past its deadline by which it finishes.

The goal here is to schedule all jobs so as to minimize the maximum lateness L = max {l(i)}.
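To make the definition concrete, here is a minimal sketch (the jobs are hypothetical, not from the original example) that computes the maximum lateness of jobs run back to back from time 0, each given as a (processing time, deadline) pair:

```python
def lateness(schedule):
    """Return the maximum lateness of jobs run in the given order from t = 0."""
    t = 0
    worst = 0
    for processing_time, deadline in schedule:
        t += processing_time                  # f_i: finish time of this job
        worst = max(worst, t - deadline)      # l(i) = max(0, f_i - d_i)
    return worst

print(lateness([(3, 4), (2, 4)]))  # second job finishes at 5, deadline 4 -> lateness 1
```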

**Minimize maximum lateness : Optimization strategy**

Let’s decide our optimization strategy. There are several orders in which jobs can be scheduled: shortest job first, earliest deadline first, and least slack time first.

Let’s see if any of the above strategies gives an optimal solution. For shortest processing time first, consider the example P1 = (1, 100) and P2 = (10, 10), where each pair is (processing time, deadline). If we schedule the shortest job first, in the order (P1, P2), P2 finishes at time 11 and the maximum lateness is 1; but if we take them as (P2, P1), the maximum lateness is 0. So, clearly, taking the shortest job first does not give us the optimal solution.
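We can verify the counterexample above with a small self-contained check (jobs are (processing time, deadline) pairs, run back to back from t = 0):

```python
def max_lateness(order):
    """Maximum lateness of (processing_time, deadline) jobs run in the given order."""
    t, worst = 0, 0
    for ti, di in order:
        t += ti
        worst = max(worst, t - di)
    return worst

p1, p2 = (1, 100), (10, 10)
print(max_lateness([p1, p2]))  # shortest job first  -> 1
print(max_lateness([p2, p1]))  # earliest deadline first -> 0
```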

Now check the least slack time approach. See if you can come up with a counterexample showing that it does not work either.

That leaves us with only one option: take the job with the most pressing deadline, that is, the one with the smallest deadline among those not yet scheduled. This is the earliest deadline first strategy. Notice that on the example above it gives maximum lateness 0; in fact, it can be shown optimal by a standard exchange argument, since swapping any two adjacent jobs that are out of deadline order never increases the maximum lateness.

**Minimize maximum lateness : Greedy algorithm**

1. Sort all jobs in ascending order of deadlines.
2. Start with time t = 0.
3. For each job in the list:
   3.1 Schedule the job at time t.
   3.2 finish time = t + processing time of the job.
   3.3 t = finish time.
4. Return (start time, finish time) for each job.

The code for the greedy algorithm returns a schedule and the maximum lateness that will occur with the chosen schedule.
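The steps above can be sketched as follows; this is a minimal Python version (the function name and input format are my own choices, not from the original post), returning both the schedule and the maximum lateness:

```python
from operator import itemgetter

def schedule_jobs(jobs):
    """Greedy earliest-deadline-first scheduling.

    jobs: list of (processing_time, deadline) pairs.
    Returns (schedule, max_lateness), where schedule is a list of
    (start_time, finish_time) pairs in the order the jobs are run.
    """
    # Step 1: sort all jobs in ascending order of deadline.
    ordered = sorted(jobs, key=itemgetter(1))
    t = 0
    schedule = []
    max_lateness = 0
    # Steps 2-3: place each job back to back, tracking the worst lateness.
    for processing_time, deadline in ordered:
        start, finish = t, t + processing_time
        schedule.append((start, finish))
        t = finish
        max_lateness = max(max_lateness, finish - deadline)
    return schedule, max_lateness

jobs = [(1, 100), (10, 10)]   # hypothetical input: (t_i, d_i) pairs
print(schedule_jobs(jobs))    # ([(0, 10), (10, 11)], 0)
```

Note that Python's built-in sort is stable, so jobs with equal deadlines keep their input order; any order among them gives the same maximum lateness.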

**Minimize maximum lateness implementation**

The complexity of the implementation is dominated by the sort, which is O(n log n) with any good sorting algorithm; the rest of the processing takes O(n).

Please share your suggestions in the comments, or point out anything you find wrong. We would love to hear what you have to say. If you find this post interesting, please feel free to share or like it.