poj 1180 Batch Scheduling
Batch Scheduling
Time Limit: 1000MS | Memory Limit: 10000K
Total Submissions: 4395 | Accepted: 2013
Description
There is a sequence of N jobs to be processed on one machine. The jobs are numbered from 1 to N, so that the sequence is 1,2,..., N. The sequence of jobs must be partitioned into one or more batches, where each batch consists of consecutive jobs in the sequence. The processing starts at time 0. The batches are handled one by one starting from the first batch as follows. If a batch b contains jobs with smaller numbers than batch c, then batch b is handled before batch c. The jobs in a batch are processed successively on the machine. Immediately after all the jobs in a batch are processed, the machine outputs the results of all the jobs in that batch. The output time of a job j is the time when the batch containing j finishes.
A setup time S is needed to set up the machine for each batch. For each job i, we know its cost factor Fi and the time Ti required to process it. If a batch contains the jobs x, x+1,... , x+k, and starts at time t, then the output time of every job in that batch is t + S + (Tx + Tx+1 + ... + Tx+k). Note that the machine outputs the results of all jobs in a batch at the same time. If the output time of job i is Oi, its cost is Oi * Fi. For example, assume that there are 5 jobs, the setup time S = 1, (T1, T2, T3, T4, T5) = (1, 3, 4, 2, 1), and (F1, F2, F3, F4, F5) = (3, 2, 3, 3, 4). If the jobs are partitioned into three batches {1, 2}, {3}, {4, 5}, then the output times (O1, O2, O3, O4, O5) = (5, 5, 10, 14, 14) and the costs of the jobs are (15, 10, 30, 42, 56), respectively. The total cost for a partitioning is the sum of the costs of all jobs. The total cost for the example partitioning above is 153.
You are to write a program which, given the batch setup time and a sequence of jobs with their processing times and cost factors, computes the minimum possible total cost.
Input
Your program reads from standard input. The first line contains the number of jobs N, 1 <= N <= 10000. The second line contains the batch setup time S which is an integer, 0 <= S <= 50. The following N lines contain information about the jobs 1, 2,..., N in that order as follows. First on each of these lines is an integer Ti, 1 <= Ti <= 100, the processing time of the job. Following that, there is an integer Fi, 1 <= Fi <= 100, the cost factor of the job.
Output
Your program writes to standard output. The output contains one line, which contains one integer: the minimum possible total cost.
Sample Input
5 1 1 3 3 2 4 3 2 3 1 4
Sample Output
153
Problem summary: a machine has to process n jobs, and it may process them in batches, i.e. several consecutive jobs can be handled together. If the current batch runs from job i through job j, and the jobs already finished have consumed T units of time, then completing this batch costs (T + S + Ti + ... + Tj) * (Fi + ... + Fj). Partition all the jobs into batches so that the total cost is minimized.
Approach: slope-optimized DP (convex hull trick).
The transition is easier to derive working backwards. Let dp[i] be the minimum cost of finishing jobs i through N. Starting a batch at job i delays the output of every job from i to N by that batch's setup plus processing time, so the whole delay can be charged to the suffix at once:
dp[i] = min{ dp[j] + (S + Ti + ... + T(j-1)) * (Fi + ... + FN) : i < j <= N+1 }
Define the suffix sums
sumT[i] = Ti + ... + TN
sumF[i] = Fi + ... + FN
and, for each candidate split point j, the line
fj(x) = (-sumT[j]) * sumF[x] + dp[j]
Then the recurrence simplifies to
dp[i] = min{ fj(i) : i < j <= N+1 } + (S + sumT[i]) * sumF[i]
The min{ fj(i) } part can be maintained with a monotone queue: as i decreases, both the slope -sumT[i+1] of the newly inserted line and the query point sumF[i] change monotonically, so each candidate enters and leaves the queue at most once, giving O(N) overall.
AC code:
#define _CRT_SECURE_NO_DEPRECATE
#include <cstdio>
#include <cstring>
#include <algorithm>
using namespace std;
typedef long long ll;

const int N_MAX = 10000 + 20;
int N, S;
int T[N_MAX], F[N_MAX];
int sumT[N_MAX], sumF[N_MAX], deq[N_MAX];
ll dp[N_MAX];   // long long: sumT * sumF can exceed the int range

// fj(x) = -sumT[j] * sumF[x] + dp[j], evaluated at x = i.
ll f(int j, int x) {
    return -(ll)sumT[j] * sumF[x] + dp[j];
}

// True if the middle line f2 can never be the minimum once f3 is
// inserted, so f2 may be popped from the back of the queue.
bool check(int f1, int f2, int f3) {
    ll a1 = -sumT[f1], b1 = dp[f1];
    ll a2 = -sumT[f2], b2 = dp[f2];
    ll a3 = -sumT[f3], b3 = dp[f3];
    return (a2 - a1) * (b3 - b2) >= (b2 - b1) * (a3 - a2);
}

int main() {
    while (scanf("%d%d", &N, &S) != EOF) {
        memset(sumT, 0, sizeof(sumT));
        memset(sumF, 0, sizeof(sumF));
        memset(dp, 0, sizeof(dp));
        for (int i = 1; i <= N; i++) scanf("%d%d", &T[i], &F[i]);
        for (int i = N; i >= 1; i--) {        // suffix sums
            sumT[i] = sumT[i + 1] + T[i];
            sumF[i] = sumF[i + 1] + F[i];
        }
        int s = 0, t = 1;                     // queue occupies deq[s..t)
        deq[0] = N + 1;
        dp[N + 1] = 0;
        for (int i = N; i >= 1; i--) {
            if (i < N) {                      // insert candidate j = i + 1
                while (s + 1 < t && check(deq[t - 2], deq[t - 1], i + 1)) t--;
                deq[t++] = i + 1;
            }
            // advance the front while the next line is at least as good at i
            while (s + 1 < t && f(deq[s], i) >= f(deq[s + 1], i)) s++;
            dp[i] = f(deq[s], i) + (ll)(S + sumT[i]) * sumF[i];
        }
        printf("%lld\n", dp[1]);
    }
    return 0;
}