Job ID = 1295061
Downloading SRA file...
Read layout: SINGLE
Converting to fastq...
2019-06-03T02:57:39 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
spots read    : 56,571,472
reads read    : 56,571,472
reads written : 56,571,472
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:18:18
56571472 reads; of these:
  56571472 (100.00%) were unpaired; of these:
    4876811 (8.62%) aligned 0 times
    39392914 (69.63%) aligned exactly 1 time
    12301747 (21.75%) aligned >1 times
91.38% overall alignment rate
Time searching: 00:18:18
Overall time: 00:18:18
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 24 files...
[bam_rmdupse_core] 8258729 / 51694661 = 0.1598 in library ' '
Converted to BAM.
Creating Bed file...
Converting to BedGraph...
INFO  @ Mon, 03 Jun 2019 12:36:34:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO  @ Mon, 03 Jun 2019 12:36:34: #1 read tag files...
INFO  @ Mon, 03 Jun 2019 12:36:34: #1 read treatment tags...
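The bowtie and samtools figures above are internally consistent, and they can be cross-checked with a few lines of arithmetic. Note that the rmdup denominator (51694661) is exactly the number of aligned reads (unique + multi-mapped):

```python
# Sanity-check the bowtie summary and samtools rmdup line from this log.
total = 56_571_472
unaligned = 4_876_811      # aligned 0 times  (8.62%)
unique = 39_392_914        # aligned exactly 1 time (69.63%)
multi = 12_301_747         # aligned >1 times (21.75%)

# The three classes partition the input reads.
assert unaligned + unique + multi == total

overall = 100 * (unique + multi) / total
print(f"{overall:.2f}% overall alignment rate")   # matches the reported 91.38%

# [bam_rmdupse_core] 8258729 / 51694661 = 0.1598 — the denominator is the
# number of aligned reads, so the duplicate rate is taken over mapped reads.
assert unique + multi == 51_694_661
dup_rate = 8_258_729 / 51_694_661
print(f"duplicate rate = {dup_rate:.4f}")          # matches the reported 0.1598
```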
INFO  @ Mon, 03 Jun 2019 12:36:35:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO  @ Mon, 03 Jun 2019 12:36:35: #1 read tag files...
INFO  @ Mon, 03 Jun 2019 12:36:35: #1 read treatment tags...
INFO  @ Mon, 03 Jun 2019 12:36:35:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO  @ Mon, 03 Jun 2019 12:36:35: #1 read tag files...
INFO  @ Mon, 03 Jun 2019 12:36:35: #1 read treatment tags...
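The three MACS2 headers above differ only in their q-value cutoff (1e-05, 1e-10, 1e-20), so the parallel runs can be sketched as a loop. This is a reconstruction, not the wrapper script itself; `echo` keeps the sketch runnable without MACS2 installed (drop it to actually call peaks):

```shell
# Sketch of the three parallel callpeak invocations seen in this log.
bam=/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.bam
cmds=$(
  for q in 05 10 20; do
    echo macs2 callpeak -t "$bam" -f BAM -g dm -n "${bam%.bam}.$q" -q "1e-$q"
  done
)
printf '%s\n' "$cmds"
```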
[INFO @ Mon, 03 Jun 2019 12:36:42 - 12:42:03: the three parallel callpeak runs log treatment-tag counts in 1000000-tag increments; the fastest run reaches 43000000 here, the other two continue below. Repetitive progress lines omitted.]
INFO  @ Mon, 03 Jun 2019 12:42:06: #1 tag size is determined as 51 bps
INFO  @ Mon, 03 Jun 2019 12:42:06: #1 tag size = 51
INFO  @ Mon, 03 Jun 2019 12:42:06: #1 total tags in treatment: 43435932
INFO  @ Mon, 03 Jun 2019 12:42:06: #1 user defined the maximum tags...
INFO  @ Mon, 03 Jun 2019 12:42:06: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO  @ Mon, 03 Jun 2019 12:42:07: #1 tags after filtering in treatment: 43435932
INFO  @ Mon, 03 Jun 2019 12:42:07: #1 Redundant rate of treatment: 0.00
INFO  @ Mon, 03 Jun 2019 12:42:07: #1 finished!
INFO  @ Mon, 03 Jun 2019 12:42:07: #2 Build Peak Model...
INFO  @ Mon, 03 Jun 2019 12:42:07: #2 looking for paired plus/minus strand peaks...
INFO  @ Mon, 03 Jun 2019 12:42:10: #2 number of paired peaks: 27
WARNING @ Mon, 03 Jun 2019 12:42:10: Too few paired peaks (27) so I can not build the model! Broader your MFOLD range parameter may erase this error.
If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 12:42:10: Process for pairing-model is terminated!
INFO  @ Mon, 03 Jun 2019 12:42:10: 40000000
cut: /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO  @ Mon, 03 Jun 2019 12:42:12: 36000000
INFO  @ Mon, 03 Jun 2019 12:42:18: 41000000
INFO  @ Mon, 03 Jun 2019 12:42:21: 37000000
INFO  @ Mon, 03 Jun 2019 12:42:26: 42000000
INFO  @ Mon, 03 Jun 2019 12:42:30: 38000000
INFO  @ Mon, 03 Jun 2019 12:42:35: 43000000
INFO  @ Mon, 03 Jun 2019 12:42:39: 39000000
INFO  @ Mon, 03 Jun 2019 12:42:39: #1 tag size is determined as 51 bps
INFO  @ Mon, 03 Jun 2019 12:42:39: #1 tag size = 51
INFO  @ Mon, 03 Jun 2019 12:42:39: #1 total tags in treatment: 43435932
INFO  @ Mon, 03 Jun 2019 12:42:39: #1 user defined the maximum tags...
INFO  @ Mon, 03 Jun 2019 12:42:39: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO  @ Mon, 03 Jun 2019 12:42:40: #1 tags after filtering in treatment: 43435932
INFO  @ Mon, 03 Jun 2019 12:42:40: #1 Redundant rate of treatment: 0.00
INFO  @ Mon, 03 Jun 2019 12:42:40: #1 finished!
INFO  @ Mon, 03 Jun 2019 12:42:40: #2 Build Peak Model...
INFO  @ Mon, 03 Jun 2019 12:42:40: #2 looking for paired plus/minus strand peaks...
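The WARNING above is MACS2 failing to build its fragment-length model from only 27 paired peaks, which is why no narrowPeak file was produced and the cleanup `rm`/`cut` commands failed. MACS2's own suggested fallback is to skip model building and use a fixed extension size. A hedged sketch of that rerun (paths taken from this log; `echo` keeps it runnable without MACS2 installed):

```shell
# Fallback suggested by the MACS2 warning: --nomodel with a fixed fragment size.
bam=/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.bam
cmd="macs2 callpeak -t $bam -f BAM -g dm -n ${bam%.bam}.05 -q 1e-05 --nomodel --extsize 147"
echo "$cmd"
```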
INFO  @ Mon, 03 Jun 2019 12:42:43: #2 number of paired peaks: 27
WARNING @ Mon, 03 Jun 2019 12:42:43: Too few paired peaks (27) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 12:42:43: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO  @ Mon, 03 Jun 2019 12:42:48: 40000000
INFO  @ Mon, 03 Jun 2019 12:42:57: 41000000
INFO  @ Mon, 03 Jun 2019 12:43:05: 42000000
INFO  @ Mon, 03 Jun 2019 12:43:15: 43000000
INFO  @ Mon, 03 Jun 2019 12:43:19: #1 tag size is determined as 51 bps
INFO  @ Mon, 03 Jun 2019 12:43:19: #1 tag size = 51
INFO  @ Mon, 03 Jun 2019 12:43:19: #1 total tags in treatment: 43435932
INFO  @ Mon, 03 Jun 2019 12:43:19: #1 user defined the maximum tags...
INFO  @ Mon, 03 Jun 2019 12:43:19: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO  @ Mon, 03 Jun 2019 12:43:20: #1 tags after filtering in treatment: 43435932
INFO  @ Mon, 03 Jun 2019 12:43:20: #1 Redundant rate of treatment: 0.00
INFO  @ Mon, 03 Jun 2019 12:43:20: #1 finished!
INFO  @ Mon, 03 Jun 2019 12:43:20: #2 Build Peak Model...
INFO  @ Mon, 03 Jun 2019 12:43:20: #2 looking for paired plus/minus strand peaks...
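The "#1 filter out redundant tags" step logged above keeps at most 1 tag per identical location and strand; since the tag count is unchanged after filtering (43435932 before and after), the redundant rate comes out as 0.00. A minimal sketch of that logic (not MACS2's actual code):

```python
# Sketch of redundant-tag filtering: keep at most max_dup tags per
# identical (chrom, position, strand) triple.
from collections import Counter

def filter_redundant(tags, max_dup=1):
    """Return tags with at most max_dup copies of each (chrom, pos, strand)."""
    seen = Counter()
    kept = []
    for tag in tags:
        seen[tag] += 1
        if seen[tag] <= max_dup:
            kept.append(tag)
    return kept

# Two tags at the same position on the same strand collapse to one;
# the opposite-strand tag at the same position is kept.
tags = [("chr2L", 100, "+"), ("chr2L", 100, "+"), ("chr2L", 100, "-")]
print(filter_redundant(tags))  # -> [('chr2L', 100, '+'), ('chr2L', 100, '-')]
```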
INFO  @ Mon, 03 Jun 2019 12:43:23: #2 number of paired peaks: 27
WARNING @ Mon, 03 Jun 2019 12:43:23: Too few paired peaks (27) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 12:43:23: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
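The final step converts the coverage track from bedGraph to bigWig. The log does not show the actual command, so this is a hypothetical sketch assuming UCSC's bedGraphToBigWig and a dm3 chromosome-sizes file (filename assumed); `echo` keeps it runnable without the tool installed:

```shell
# Hypothetical sketch of the bedGraph -> bigWig conversion logged above.
prefix=/home/okishinya/chipatlas/results/dm3/SRX3167255/SRX3167255
cmd="bedGraphToBigWig $prefix.bedGraph dm3.chrom.sizes $prefix.bw"
echo "$cmd"
```

bedGraphToBigWig requires the bedGraph to be sorted by chromosome and position, and the chrom.sizes file must match the assembly (dm3 here).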