failed pipeline jobs

LabKey Support Forum
Ben Bimber  2011-01-14 10:52
Status: Closed
 
I have 2 questions related to deleting old pipeline jobs:

1. We have a pipeline that executes on a client server using ActiveMQ. During the debug process, a number of bugs were exposed in the pipeline code that caused jobs to fail unexpectedly. Sometimes these failures stopped the entire pipeline process, leaving the temp directory and files on the client machine. Is it safe to delete these unwanted files?

2. Related to question 1: there are a bunch of pipeline jobs that have 'ERROR' status in the folder where I ran them. Can I safely either 'complete' or 'fail' them just to get them out of the way?

Now, on to pipeline jobs that completed successfully:

3. Many pipeline jobs have completed normally. The results are exposed to users through a pipeline files webpart, with each pipeline job getting its own subfolder. I have found that when users look at the results and want to change something, they often delete the pipeline results and restart the job. As far as I'm aware, even though these are presumably deleted through LabKey via WebDAV, there will still be DB records pointing to a non-existent file. Any harm here? Assuming users do legitimately want to get rid of some pipeline runs, is there a preferred way to do so?

Thanks.
 
 
jeckels responded:  2011-01-19 17:20
1. Yes.
2. You can mark them as Complete or just Delete them.
3. Yes, there will still be entries in some of the database tables. That shouldn't cause any problems. If a user tries to grab a data file that was referenced in one of the runs, it will be reported as not available on disk. If users want to clean up the list, they can just delete the runs through the standard UI. Note that this is completely separate from deleting the files on disk.
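The distinction above, between database run records and the files on disk, can be sketched in a few lines. This is a minimal, hypothetical Python example (the field names `status` and `file_path` are assumptions, not the actual LabKey pipeline schema) showing how an admin might triage run records into errored runs and completed runs whose data file is gone:

```python
import os

def triage_runs(runs):
    """Split run records into errored runs and 'orphaned' runs:
    completed runs whose data file no longer exists on disk.

    Each record is a dict with hypothetical keys 'status' and
    'file_path'; the real LabKey tables use different names.
    """
    errored = [r for r in runs if r["status"] == "ERROR"]
    orphaned = [
        r for r in runs
        if r["status"] == "COMPLETE" and not os.path.exists(r["file_path"])
    ]
    return errored, orphaned

# Example with in-memory records standing in for DB rows:
runs = [
    {"status": "ERROR", "file_path": "/data/job1/out.txt"},
    {"status": "COMPLETE", "file_path": "/no/such/path/out.txt"},
]
errored, orphaned = triage_runs(runs)
print(len(errored), len(orphaned))  # -> 1 1
```

The point of the sketch is the same one Josh makes: deleting a run record and deleting its files are independent operations, so either side can be cleaned up without the other.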

Thanks,
Josh