Unpack tar files in legacy objects

Project:RUcore Jobs & Reports
Component:Job - test
Category:task
Priority:critical
Assigned:dhoover
Status:Moved to JIRA
Description

In accordance with our archival master policy, we need to unpack all tar files in legacy objects and create separate archival masters. We should review the script to do this in sw_arch and then schedule the activity on the production server (after appropriate testing on development).
rcj- 03/28/2014

Comments

#1

Priority:normal» critical

SW-Arch recommends running on each content model individually: first, on devel-test, then on staging, then on production. The first content model we will do is Photograph.
Please update this issue with a report of the objects (number) and files (number) in each run, and any exceptions reported. For each content model, there will be three job runs.
After Photographs, run on Document.
It would be helpful to list here all the Content Models we have in Fedora.

#2

Facet results for this query:

9393 hits for photograph
7639 hits for document
4574 hits for etd
2769 hits for map
1794 hits for record
867 hits for manuscript
852 hits for book
850 hits for periodical
735 hits for video
419 hits for collection
293 hits for pamphlet
165 hits for stillimage
69 hits for audio
38 hits for dataset
25 hits for transcript
21 hits for movingimage
12 hits for analytic

#3

Component:Job - production server» Job - test

Work initially on migrating the objects with single TAR archival datastreams that already have, and make use of, the current technical metadata schema.

The following issue/job addresses migrating legacy technical metadata datastreams. That job will need to be completed before all of the legacy TAR archival datastreams can be untarred.

https://software.libraries.rutgers.edu/node/3582

#4

Here is the report on the test of content model Document objects on November 15, 2016.

#5

Here is an Excel file with summary results for a dry run on the first 8000 of the 16000 or so objects on rep-test. The dry run took a day and a half to run. See the attached file. We are currently running the dry run script on the second half.

#6

The complete dryrun summary is done and available in the attached Excel file.

#7

Here is the report summary for the 12/16 dryrun test.

#8

#9

Here is the Excel spreadsheet of the real untar run on rep-test, 12/16/16-12/18/16.
Time for the test of the real untar script on rep-test, Dec. 16-18, 2016:
triggs@rep-devel:/mellon/htdocs/dlr/EDIT> time for file in `cat rep-test-success-list.txt`; do
  php -f foo-ndosha256-tar.php pid=$file
done > untar-121616-successlist-realrun.log

real 2507m32.859s
user 77m5.433s
sys 207m19.209s

Mon Dec 19 09:41:41 EST 2016
Total tar bytes: 119378114048
Total number of ARCH datastreams extracted: 61888
Total number of DARCH datastreams extracted: 962
Total extracted datastream bytes: 119318750097
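
(For reference, totals like these can be recomputed from the per-object TOTALS lines of the run's summary file; a sketch, assuming the tab-separated field layout shown in the sample lines in comment #37, and a summary file named like the log:)

awk -F'\t' '/^TOTALS/ { tar += $3; arch += $4; darch += $5; out += $6 }
  END {
    printf "Total tar bytes: %.0f\n", tar
    printf "Total ARCH datastreams extracted: %d\n", arch
    printf "Total DARCH datastreams extracted: %d\n", darch
    printf "Total extracted datastream bytes: %.0f\n", out
  }' untar-121616-successlist-realrun.txt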

#10

Assigned to:triggs» dhoover

The dry run script is ready for a test on staging.
-rw-r--r-- 1 triggs developers 16938 Dec 21 15:15 staging-full-list.txt
-rw-r--r-- 1 triggs developers 51200 Dec 21 15:13 ndosha256-tar.tar
-rw-r--r-- 1 triggs developers 1970 Dec 21 15:12 ndosha256-tar.readme

The readme is as follows:
This is the readme for the ndosha256-tar.sh script for running the untar and sha256 scripts in dryrun or realrun mode.

The needed scripts are bundled in a tar file in /mellon/cvsroot called "ndosha256-tar.tar" that includes the following files:
-rwxr-xr-x triggs/developers 1585 2016-12-20 10:52 ndosha256-tar.sh
-rw-r--r-- triggs/developers 22305 2016-12-15 13:23 run-ndosha256-tar.php
-rw-r--r-- triggs/developers 20301 2016-12-19 15:34 real-ndosha256-tar.php
-rwxr-xr-x triggs/developers 190 2016-12-19 16:23 getexcel.sh
-rw-r--r-- triggs/developers 1970 2016-12-21 15:11 ndosha256-tar.readme

Also in /mellon/cvsroot is a copy of this readme and any lists of target Fedora PIDs generated for a run of the script. To run the script:
1) cd to the dlr/EDIT directory
2) untar ndosha256-tar.tar
3) copy down a target list file, e.g., staging-full-list.txt
4) type: time ./ndosha256-tar.sh -dryrun staging-full-list.txt
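
(For orientation, a heavily simplified sketch of what the wrapper presumably does: pick the dryrun or realrun PHP driver bundled above and loop it over the PID list. The real script also writes the timestamped log and summary files.)

#!/bin/sh
# Sketch only - assumed wrapper logic, not the production script.
MODE=$1    # -dryrun or -realrun
LIST=$2    # file with one Fedora PID per line
case "$MODE" in
  -dryrun)  SCRIPT=run-ndosha256-tar.php ;;
  -realrun) SCRIPT=real-ndosha256-tar.php ;;
  *) echo "usage: $0 -dryrun|-realrun pid-list.txt"; exit 1 ;;
esac
echo "running $MODE ndosha256-tar.php on $LIST list"
for pid in `cat "$LIST"`; do
  php -f "$SCRIPT" pid=$pid
done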

Here is an example run already on rep-test:
triggs@rep-devel:/mellon/htdocs/dlr/EDIT> time ./ndosha256-tar.sh -dryrun transcriptslist.txt
running -dryrun ndosha256-tar.php on transcriptslist.txt list
Running dryrun test...transcriptslist.txt objects
Ran on 33 untar-161220-105257-dryrun.txt objects...
Log file is 'untar-161220-105257-dryrun.log', summary file is 'untar-161220-105257-dryrun.txt'

real 0m19.425s
user 0m0.968s
sys 0m0.684s

The summary file will contain Excel-ready tab-separated fields with an exit status for all the objects. The log file will contain the full details of the run.
The summary file will be used to generate a new list file of objects with an exit status of SUCCESS (see the sketch after the command below). The real run can then be accomplished with a command like:

time ./ndosha256-tar.sh -realrun staging-success-list.txt
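
(Sketch of that list-generation step, assuming the PID is the last tab-separated field of the summary file:)

grep SUCCESS untar-$tstamp-dryrun.txt | awk -F'\t' '{ print $NF }' > staging-success-list.txt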

The realrun will produce an additional purge log in the form '$tstamp-dosha256-tar.log' that can safely be used to remove the old ARCH1 tar files after they have been split out and added individually to the objects in the list.

#11

We have discovered an issue with a (relatively small) number of tar files that contain Mac "._" (.DS_Store-style) metadata files, e.g.
-rw-r--r-- 0 scaredpoet scaredpoet 82 Apr 3 2008 ./._2964DS2_Page_01.tiff
-rw-r--r-- 0 scaredpoet scaredpoet 69801142 Apr 3 2008 ./2964DS2_Page_01.tiff
-rw-r--r-- 0 scaredpoet scaredpoet 82 Apr 3 2008 ./._2964DS2_Page_02.tiff
-rw-r--r-- 0 scaredpoet scaredpoet 57485104 Apr 3 2008 ./2964DS2_Page_02.tiff
-rw-r--r-- 0 scaredpoet scaredpoet 82 Apr 3 2008 ./._2964DS2_Page_03.tiff
-rw-r--r-- 0 scaredpoet scaredpoet 54505324 Apr 3 2008 ./2964DS2_Page_03.tiff
We need to toss out any files beginning with ._ and only ingest ARCHs that are real files. We will work on this in the new year.
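
(Two possible ways to do that with GNU tar/grep, sketched assuming the ARCH tar has been pulled down as a local file, here called ARCH1.tar:)

tar -xf ARCH1.tar --exclude='._*'     # skip "._" members at extraction time
tar -tf ARCH1.tar | grep -v '/\._'    # or drop them from a member listing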

#12

The Mac .DS_Store issue is fixed, and the tar has been repackaged and is ready for the staging dry-run test.

#13

Asked Jeffery to adjust the script so it could run from any directory and not be restricted to running from within dlr/EDIT.

That was done and tested, and a new tar file was then put in place in /mellon/cvsroot.

I used the new version; here are the results:

rep-staging:/home/UNTAR # time ./ndosha256-tar.sh -dryrun staging-full-list.txt
running -dryrun ndosha256-tar.php on staging-full-list.txt list
Running dryrun test...staging-full-list.txt objects
Ran on 893 untar-170113-102851-dryrun.txt objects...
Log file is 'untar-170113-102851-dryrun.log', summary file is 'untar-170113-102851-dryrun.txt'

real 18m51.647s
user 1m33.302s
sys 3m7.872s

#14

Hi Dave,

Thanks for running this test. It seems to have been pretty quick, though from the look of it that's because so many of the objects didn't need untarring, and those that did didn't have a lot of files in their tars.

#15

I get the following totals from 262 success objects in untar-170113-102851-dryrun.txt:
Total tar bytes: 31306738688
Total number of ARCH datastreams extracted: 646
Total number of DARCH datastreams extracted: 4
Total extracted datastream bytes: 31304869671

607 NOT NECESSARY
262 SUCCESS
14 BAD RARCH TARS
7 NO TECHMD
2 MULTIPLE ARCH TARS
I'm attaching the Excel version for people to play with.
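
(A breakdown like this can be pulled straight from the summary file; sketch, assuming the exit status is the second tab-separated field:)

cut -f2 untar-170113-102851-dryrun.txt | sort | uniq -c | sort -rn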

#16

Dave, I think we have the go-ahead to run the "realrun" of this on staging whenever you are ready.
./ndosha256-tar.sh -realrun staging-full-list.txt

#17

Run on rep-staging in real mode last night

The two report files it produced are attached.

#18

Thanks Dave! This looks right for staging, as we expected from the dryrun.

#19

We discussed that the original tar files that were exploded would be removed after the new exploded files/datastreams were reviewed.

Did this happen on rep-test.libraries.rutgers.edu yet?

#20

I believe so. I deleted them for Ashwin shortly after the run.

#21

The following totals were extracted from untar-170201-165759-realrun.txt:
Total tar bytes: 31306738688
Total number of ARCH datastreams extracted: 646
Total number of DARCH datastreams extracted: 4
Total extracted datastream bytes: 31304869671

After the realrun there should be a time-stamped purge log created in the directory where the script was run: the purge log is '$tstamp-dosha256-tar.log', i.e., 170201-165759-dosha256-tar.log.

To purge the original tar ARCH1 datastreams on rep-staging, download /mellon/cvsroot/zaparchtars.sh and type:
./zaparchtars.sh /path/to/170201-165759-dosha256-tar.log

The script can be used for other ARCH1 purge logs as well.
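
(I have not inspected zaparchtars.sh itself; a minimal sketch of the assumed logic, taking the purge log to list one old ARCH1 tar file path per line:)

#!/bin/sh
# Sketch only - assumed zaparchtars.sh logic.
LOG=$1   # purge log from a realrun, e.g. 170201-165759-dosha256-tar.log
for f in `cat "$LOG"`; do
  rm -v "$f"   # remove each listed ARCH1 tar, echoing what was removed
done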

#22

We should be ready for an initial dry-run test on production. I have created a list of 2805 cmodel-map objects on production. These are mainly simple stillimage objects of the sort we are targeting first, and represent a significant set, though a more manageable number than the 9399 cmodel-photograph objects on production.

The list can be found at
/mellon/cvsroot/prod-cmodel-map-list.txt on rep-test. The test can be run with the following command (as described in /mellon/cvsroot/ndosha256-tar.readme):
time ./ndosha256-tar.sh -dryrun prod-cmodel-map-list.txt

#23

For future discussion, here is a list of the content models as of today and how many objects have each model.

9411 hits for Document
9399 hits for Photograph
6520 hits for ETD
2805 hits for Map
1792 hits for Record
1428 hits for 'Still Image'
918 hits for Book
861 hits for Manuscript
847 hits for Periodical
744 hits for Video
292 hits for Pamphlet
259 hits for Audio
91 hits for Dataset
85 hits for 'Moving Image'
83 hits for Analytic
25 hits for Transcript
11 hits for EAD

#24

Jeffery

For the first dryrun on rep-prod, the list is not there:

dhoover@rep-devel:/mellon/cvsroot> ls -al /mellon/cvsroot/prod-cmodel-map-list.txt
ls: cannot access /mellon/cvsroot/prod-cmodel-map-list.txt: No such file or directory

#25

Oops - I forgot to copy it over to /mellon/cvsroot. I just did.

#26

The following command was run on rep-prod:

nohup time ./ndosha256-tar.sh -dryrun prod-cmodel-map-list.txt &

Not sure if it found any tar files to explode. Reports are attached.

#27

I'll have to look over the logs. I'm surprised that the script decided no tars needed unpacking, since in the big log I see at least one object with a tar where it couldn't find any files to extract. I suspect that $lpath was empty, suggesting that the find command failed for some reason. The command would be:
$lpath = shell_exec("$FIND $FEDORADSROOT -name \"*$llocation\"");
where $llocation is something like "rutgers-lib:24741+ARCH1+ARCH1.0".
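
(With $FIND as /usr/bin/find and $FEDORADSROOT as /mellon/datastreams, that call boils down to:)

find /mellon/datastreams -name "*rutgers-lib:24741+ARCH1+ARCH1.0"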

#28

There should have been 1957 old tars in this batch, but all had the lpath failure. Any ideas why the find command would fail on prod and not the other servers? If we can solve this, we could rerun the dryrun on either the whole list or the 1957 subset.

#29

On rep-prod, /mellon/includes/incs sets FEDORADSROOT to /mellon/datastreams, which is a symlink:

dhoover@rep-prod:/mellon/includes> ls -ald /mellon/datastreams
lrwxrwxrwx 1 root root 32 Nov 17 2011 /mellon/datastreams -> /rep-prod/repository/datastreams

On the other servers it is an absolute path. So either we need to add -follow to the find command in the script, or change /mellon/includes/incs to use the absolute path (/rep-prod/repository/datastreams) for FEDORADSROOT.
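
(Either fix is small; with GNU find, -follow is the older spelling of -L, e.g.:)

find -L /mellon/datastreams -name "*$llocation"      # -L follows all symlinks
find /mellon/datastreams -follow -name "*$llocation" # older spelling, same effect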

#30

Thanks Dave! Why don't we try with the -follow option so that you don't have to change the incs just for this. I'll get you a new version of the scripts and we can rerun the dryrun test.

#31

I downloaded the new tar file that includes the use of the
find -follow parameter.

At 11:02pm 2/15/17 I ran the following:

nohup time ./ndosha256-tar.sh -dryrun prod-cmodel-map-list.txt &

As of noon today, 2/16/17, it is still running, having processed 1229 of 2805:

wc -l prod-cmodel-map-list.txt = 2805
grep TOTAL untar-170215-230245-dryrun.log |wc -l = 1229
grep TOTAL untar-170215-230245-dryrun.log | grep NOT |wc -l = 296
grep TOTAL untar-170215-230245-dryrun.log | grep -v NOT |wc -l = 934

#32

An anomaly has been found: the nohup.out file has captured the following messages:

rep-prod:/home/UNTAR # sort -u nohup.out
running -dryrun ndosha256-tar.php on prod-cmodel-map-list.txt list
Running dryrun test...prod-cmodel-map-list.txt objects
/usr/bin/find: `/mellon/datastreams/2016/0917/01/31/rutgers-lib_41121+PTIF-13+PTIF-13.0': Too many levels of symbolic links
/usr/bin/find: `/mellon/datastreams/2016/0917/01/31/rutgers-lib_41121+PTIF-13+PTIF-13.1': Too many levels of symbolic links
/usr/bin/find: `/mellon/datastreams/2016/0917/10/39/rutgers-lib_46533+PTIF-1+PTIF-1.1': Too many levels of symbolic links
/usr/bin/find: `/mellon/datastreams/2016/0926/13/53/rutgers-lib_41121+PTIF-11+PTIF-11.2': Too many levels of symbolic links

These four symlink errors show up every time find is invoked (3760 times in total). They are basically symlinks that point to themselves. While the 41121 links are from 9/17/16 and 9/26/16, the 46533 link is from 2/15/17, implying that something actively caused this to happen this week.

I am not sure why we should have a symlink in Fedora's ingested datastream path at all. Can someone please look at this and help determine whether the symlink files should just be purged, and if so, where the real PTIF-13.0 and PTIF-13.1 are supposed to be?
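
(Sketch for flagging the offenders without touching anything else: a link that points at itself fails test -e with ELOOP, so dangling and self-referential links can be listed and reviewed before any deletion. -H makes find traverse the /mellon/datastreams command-line symlink itself while still examining links below it as links.)

find -H /mellon/datastreams -type l ! -exec test -e {} \; -print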

rep-prod:/mellon/datastreams/2016/0917/01/31 # ls -alt
total 35953
drwxr-xr-x 2 nobody nobody 1536 Feb 15 15:49 .
lrwxrwxrwx 1 nobody nobody 75 Sep 26 13:54 rutgers-lib_41121+PTIF-13+PTIF-13.0 -> /repository/datastreams/2016/0917/01/31/rutgers-lib_41121+PTIF-13+PTIF-13.0
lrwxrwxrwx 1 nobody nobody 75 Sep 26 12:00 rutgers-lib_41121+PTIF-13+PTIF-13.1 -> /repository/datastreams/2016/0917/01/31/rutgers-lib_41121+PTIF-13+PTIF-13.1
drwxr-xr-x 61 nobody nobody 1180 Sep 17 01:59 ..
-rw-r--r-- 1 nobody nobody 342575 Sep 17 01:32 rutgers-lib_41126+PTIF-4+PTIF-4.0
-rw-r--r-- 1 nobody nobody 1048245 Sep 17 01:32 rutgers-lib_41126+PTIF-3+PTIF-3.0
-rw-r--r-- 1 nobody nobody 1858613 Sep 17 01:31 rutgers-lib_41126+PTIF-2+PTIF-2.0
-rw-r--r-- 1 nobody nobody 1667123 Sep 17 01:31 rutgers-lib_41126+PTIF-1+PTIF-1.0
-rw-r--r-- 1 nobody nobody 811659 Sep 17 01:31 rutgers-lib_41124+PTIF-7+PTIF-7.0
-rw-r--r-- 1 nobody nobody 602065 Sep 17 01:31 rutgers-lib_41124+PTIF-6+PTIF-6.0
-rw-r--r-- 1 nobody nobody 352893 Sep 17 01:31 rutgers-lib_41124+PTIF-5+PTIF-5.0
-rw-r--r-- 1 nobody nobody 169735 Sep 17 01:31 rutgers-lib_41124+PTIF-4+PTIF-4.0
-rw-r--r-- 1 nobody nobody 695223 Sep 17 01:31 rutgers-lib_41124+PTIF-3+PTIF-3.0
-rw-r--r-- 1 nobody nobody 1001201 Sep 17 01:31 rutgers-lib_41124+PTIF-2+PTIF-2.0
-rw-r--r-- 1 nobody nobody 804225 Sep 17 01:31 rutgers-lib_41124+PTIF-1+PTIF-1.0
-rw-r--r-- 1 nobody nobody 836618 Sep 17 01:31 rutgers-lib_41122+PTIF-7+PTIF-7.0
-rw-r--r-- 1 nobody nobody 675670 Sep 17 01:31 rutgers-lib_41122+PTIF-6+PTIF-6.0
-rw-r--r-- 1 nobody nobody 440422 Sep 17 01:31 rutgers-lib_41122+PTIF-5+PTIF-5.0
-rw-r--r-- 1 nobody nobody 166618 Sep 17 01:31 rutgers-lib_41122+PTIF-4+PTIF-4.0
-rw-r--r-- 1 nobody nobody 473084 Sep 17 01:31 rutgers-lib_41122+PTIF-3+PTIF-3.0
-rw-r--r-- 1 nobody nobody 727280 Sep 17 01:31 rutgers-lib_41122+PTIF-2+PTIF-2.0
-rw-r--r-- 1 nobody nobody 892624 Sep 17 01:31 rutgers-lib_41122+PTIF-1+PTIF-1.0
-rw-r--r-- 1 nobody nobody 461048 Sep 17 01:31 rutgers-lib_41121+PTIF-9+PTIF-9.0
-rw-r--r-- 1 nobody nobody 235510 Sep 17 01:31 rutgers-lib_41121+PTIF-7+PTIF-7.0
-rw-r--r-- 1 nobody nobody 571816 Sep 17 01:31 rutgers-lib_41121+PTIF-5+PTIF-5.0
-rw-r--r-- 1 nobody nobody 768432 Sep 17 01:31 rutgers-lib_41121+PTIF-3+PTIF-3.0
-rw-r--r-- 1 nobody nobody 677102 Sep 17 01:31 rutgers-lib_41121+PTIF-11+PTIF-11.0
-rw-r--r-- 1 nobody nobody 647046 Sep 17 01:31 rutgers-lib_41121+PTIF-1+PTIF-1.0

rep-prod:/mellon/datastreams/2016/0926/13/53 # ls -alt
total 6
drwxr-xr-x 2 nobody nobody 53 Sep 26 13:54 .
lrwxrwxrwx 1 nobody nobody 75 Sep 26 13:54 rutgers-lib_41121+PTIF-11+PTIF-11.2 -> /repository/datastreams/2016/0926/13/53/rutgers-lib_41121+PTIF-11+PTIF-11.2
drwxr-xr-x 4 nobody nobody 40 Sep 26 13:54 ..

The other rutgers-lib number that shows this:

rep-prod:/mellon/datastreams/2016/0917/10/39 # ls -alt
total 46587
drwxr-xr-x 2 nobody nobody 510 Feb 15 16:13 .
lrwxrwxrwx 1 nobody nobody 73 Feb 15 16:13 rutgers-lib_46533+PTIF-1+PTIF-1.1 -> /repository/datastreams/2016/0917/10/39/rutgers-lib_46533+PTIF-1+PTIF-1.1
drwxr-xr-x 62 nobody nobody 1200 Sep 17 10:59 ..

#33

Hi Dave,

For what it might be worth, here's part of the audit trail for one of these objects in late September.

Audit 234 Description: Deleting duplicate datastream versions . . . Purged datastream (ID=PTIF-1), versions ranging from 2016-09-17T01:31:36.438Z to 2016-09-17T01:31:36.438Z. This resulted in the permanent removal of 1 datastream version(s) (2016-09-17T01:31:36.438Z) and all associated audit records.
Action: purgeDatastream
DSID: PTIF-1
Date: 2017-02-15T20:49:52.735Z

Audit 235 Description: Deleting duplicate datastream versions . . . Purged datastream (ID=PTIF-3), versions ranging from 2016-09-17T01:31:41.752Z to 2016-09-17T01:31:41.752Z. This resulted in the permanent removal of 1 datastream version(s) (2016-09-17T01:31:41.752Z) and all associated audit records.
Action: purgeDatastream
DSID: PTIF-3
Date: 2017-02-15T20:49:52.865Z

Audit 236 Description: Deleting duplicate datastream versions . . . Purged datastream (ID=PTIF-5), versions ranging from 2016-09-17T01:31:43.110Z to 2016-09-17T01:31:43.110Z. This resulted in the permanent removal of 1 datastream version(s) (2016-09-17T01:31:43.110Z) and all associated audit records.
Action: purgeDatastream
DSID: PTIF-5
Date: 2017-02-15T20:49:53.003Z

Audit 237 Description: Deleting duplicate datastream versions . . . Purged datastream (ID=PTIF-7), versions ranging from 2016-09-17T01:31:44.310Z to 2016-09-17T01:31:44.310Z. This resulted in the permanent removal of 1 datastream version(s) (2016-09-17T01:31:44.310Z) and all associated audit records.
Action: purgeDatastream
DSID: PTIF-7
Date: 2017-02-15T20:49:53.101Z

Audit 238 Description: Deleting duplicate datastream versions . . . Purged datastream (ID=PTIF-9), versions ranging from 2016-09-17T01:31:45.545Z to 2016-09-17T01:31:45.545Z. This resulted in the permanent removal of 1 datastream version(s) (2016-09-17T01:31:45.545Z) and all associated audit records.
Action: purgeDatastream
DSID: PTIF-9
Date: 2017-02-15T20:49:53.230Z

Audit 239 Description: Deleting duplicate datastream versions . . . Purged datastream (ID=PTIF-11), versions ranging from 2016-09-26T13:54:08.283Z to 2016-09-26T13:54:08.283Z. This resulted in the permanent removal of 1 datastream version(s) (2016-09-26T13:54:08.283Z) and all associated audit records.
Action: purgeDatastream
DSID: PTIF-11
Date: 2017-02-15T20:49:53.462Z

#34

The object in question is rutgers-lib:41121

#35

Please let me know if anyone wants to do further investigation of the issue with the symlinks for PTIF datastreams reported above for these two objects, rutgers-lib_41121 and rutgers-lib_46533.

From my perspective, symlinks like these should never be present in the live Fedora datastream path (especially pointing to themselves), and we should be able to delete them from that location without affecting the real objects and their datastreams.

#36

The first dryrun on production for 2805 cmodel MAP objects has finished.

It started at 2017-02-15 23:02:45 and ended at 2017-02-17 09:31:00.

7023.21 user
10812.23 system
34:29:13 elapsed
14%CPU (0avgtext+0avgdata 331768maxresident)k
1041638824 inputs+1035896552 outputs (717major+42478869minor)pagefaults 0swaps

It generated the 4 symlink failures detailed above. Full reports are attached.

#37

This looks like a bigger set than one might have thought:
Total tar bytes: 530092273664
Total number of ARCH datastreams extracted: 3387
Total number of DARCH datastreams extracted: 575
Total extracted datastream bytes: 530077875584

Total number of objects looked at: 2805
triggs@rep-devel:/mellon/htdocs/dlr/EDIT> egrep SUCCESS untar-170215-230245-dryrun.txt | wc -l ## successful untars
1969
triggs@rep-devel:/mellon/htdocs/dlr/EDIT> egrep NOT untar-170215-230245-dryrun.txt | wc -l ## not necessary to process
833
triggs@rep-devel:/mellon/htdocs/dlr/EDIT> egrep BAD untar-170215-230245-dryrun.txt | wc -l ## BAD RARCH TARS, BAD REDIRECT TARS, or BAD TIFF
3
triggs@rep-devel:/mellon/htdocs/dlr/EDIT> egrep BAD untar-170215-230245-dryrun.txt
TOTALS BAD TIFF 197693440 1 0 197680930 Map rutgers-lib:36002
TOTALS BAD TIFF 200724480 1 0 200719880 Map rutgers-lib:36003
TOTALS BAD TIFF 214763520 0 0 214756406 Map rutgers-lib:32759

1969 + 833 + 3 = 2805

Only DARCHs found in tar: 126

#38

Here is the Excel file for the dry run started 02/15/2017.

#39

The new tar is in place:
-rw-r--r-- 1 triggs developers 61440 Feb 17 15:10 /mellon/cvsroot/ndosha256-tar.tar
This includes the collname script and the changes to the automatically named log file as requested.

#40

Regarding the symlinks: I wonder if the image server, IIIF, might have created them. I see the issue was mainly with rutgers-lib:41121. I had an issue with this resource when running the delete-duplicates job. Some manual deletion of "extra" datastreams had occurred between the time the creation job for the PTIFs was run and the time the duplicate removal job was run.

Perhaps a side effect of the manual datastream deletion was that it created symlinks where they should not have been. I am not sure how to even try to replicate this, if it's even possible.

#41

I'm just wondering if we have resolved the symlink issue yet and are ready to re-run the dry run test?

#42

I have deleted the symlinks mentioned in this issue from the fedora datastreams directory, so we should not get the "Too many levels of symbolic links" message anymore.

We can resume this process next week. Is the next step to run this batch in real mode, or are we looking at all dryruns first?

#43

Hmmm. I seem to remember that we thought it might be good to try the dryrun once again first to see if the timing improved without the symlinks, but we could do the realrun if you think it's good to go otherwise.

#44

Status:active» Moved to JIRA
