Job ID = 1309124
Downloading SRA file...
Read layout: SINGLE
Converting to fastq...
spots read      : 58,628,307
reads read      : 58,628,307
reads written   : 58,628,307
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:23:14
58628307 reads; of these:
  58628307 (100.00%) were unpaired; of these:
    4494280 (7.67%) aligned 0 times
    40514414 (69.10%) aligned exactly 1 time
    13619613 (23.23%) aligned >1 times
92.33% overall alignment rate
Time searching: 00:23:14
Overall time: 00:23:14
Mapping complete.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 24 files...
[bam_rmdupse_core] 8470870 / 54134027 = 0.1565 in library ' '
Converted to BAM.
Creating Bed file...
Converting to BedGraph...
INFO @ Tue, 04 Jun 2019 00:09:38: 
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off

INFO @ Tue, 04 Jun 2019 00:09:38: #1 read tag files...
INFO @ Tue, 04 Jun 2019 00:09:38: #1 read treatment tags...
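As a sanity check, the bowtie and samtools figures above are internally consistent: the mapped-read count (58,628,307 − 4,494,280 = 54,134,027) is exactly the denominator in the `bam_rmdupse_core` line. A minimal shell sketch recomputing the reported percentages from the raw counts in this log:

```shell
# Overall bowtie alignment rate: (total reads - unaligned reads) / total reads
awk 'BEGIN { printf "%.2f%%\n", 100 * (58628307 - 4494280) / 58628307 }'   # 92.33%

# Duplicate rate reported by bam_rmdupse_core: duplicates / mapped reads
awk 'BEGIN { printf "%.4f\n", 8470870 / 54134027 }'                        # 0.1565
```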
INFO @ Tue, 04 Jun 2019 00:09:38: 
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off

INFO @ Tue, 04 Jun 2019 00:09:38: #1 read tag files...
INFO @ Tue, 04 Jun 2019 00:09:38: #1 read treatment tags...
INFO @ Tue, 04 Jun 2019 00:09:38: 
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off

INFO @ Tue, 04 Jun 2019 00:09:38: #1 read tag files...
INFO @ Tue, 04 Jun 2019 00:09:38: #1 read treatment tags...
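The three parallel `callpeak` runs above differ only in the q-value cutoff (1e-05, 1e-10, 1e-20). In MACS2's narrowPeak output that significance ends up in column 9 as −log10(q-value), so a cutoff of q < 1e-10 corresponds to `$9 >= 10`. A minimal sketch with two fabricated peak records (not from this job) showing how a narrowPeak file could be re-filtered at a stricter threshold:

```shell
# Two made-up narrowPeak records; column 9 holds -log10(q-value).
# Keeping peaks with q < 1e-10 means requiring column 9 >= 10.
printf 'chr2L\t100\t200\tpeak1\t0\t.\t5.0\t12.0\t25.0\t50\nchr2L\t300\t400\tpeak2\t0\t.\t3.0\t6.0\t7.0\t40\n' \
  | awk -F'\t' '$9 >= 10'
# -> keeps only peak1 (column 9 = 25.0, i.e. q = 1e-25)
```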
INFO @ Tue, 04 Jun 2019 00:09:46: 1000000
INFO @ Tue, 04 Jun 2019 00:09:47: 1000000
INFO @ Tue, 04 Jun 2019 00:09:47: 1000000
[... interleaved read-loading progress from the three parallel MACS2 runs, in 1,000,000-read increments up to 45,000,000 reads (00:09:46–00:15:38) ...]
INFO @ Tue, 04 Jun 2019 00:15:38: 45000000
INFO @ Tue, 04 Jun 2019 00:15:44: #1 tag size is determined as 50 bps
INFO @ Tue, 04 Jun 2019 00:15:44: #1 tag size = 50
INFO @ Tue, 04 Jun 2019 00:15:44: #1 total tags in treatment: 45663157
INFO @ Tue, 04 Jun 2019 00:15:44: #1 user defined the maximum tags...
INFO @ Tue, 04 Jun 2019 00:15:44: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Tue, 04 Jun 2019 00:15:44: 45000000
INFO @ Tue, 04 Jun 2019 00:15:45: #1 tags after filtering in treatment: 45663157
INFO @ Tue, 04 Jun 2019 00:15:45: #1 Redundant rate of treatment: 0.00
INFO @ Tue, 04 Jun 2019 00:15:45: #1 finished!
INFO @ Tue, 04 Jun 2019 00:15:45: #2 Build Peak Model...
INFO @ Tue, 04 Jun 2019 00:15:45: #2 looking for paired plus/minus strand peaks...
INFO @ Tue, 04 Jun 2019 00:15:47: 44000000
INFO @ Tue, 04 Jun 2019 00:15:48: #2 number of paired peaks: 0
WARNING @ Tue, 04 Jun 2019 00:15:48: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Tue, 04 Jun 2019 00:15:48: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Tue, 04 Jun 2019 00:15:50: #1 tag size is determined as 50 bps
INFO @ Tue, 04 Jun 2019 00:15:50: #1 tag size = 50
INFO @ Tue, 04 Jun 2019 00:15:50: #1 total tags in treatment: 45663157
INFO @ Tue, 04 Jun 2019 00:15:50: #1 user defined the maximum tags...
INFO @ Tue, 04 Jun 2019 00:15:50: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Tue, 04 Jun 2019 00:15:51: #1 tags after filtering in treatment: 45663157
INFO @ Tue, 04 Jun 2019 00:15:51: #1 Redundant rate of treatment: 0.00
INFO @ Tue, 04 Jun 2019 00:15:51: #1 finished!
INFO @ Tue, 04 Jun 2019 00:15:51: #2 Build Peak Model...
INFO @ Tue, 04 Jun 2019 00:15:51: #2 looking for paired plus/minus strand peaks...
INFO @ Tue, 04 Jun 2019 00:15:55: #2 number of paired peaks: 0
WARNING @ Tue, 04 Jun 2019 00:15:55: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Tue, 04 Jun 2019 00:15:55: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Tue, 04 Jun 2019 00:15:55: 45000000
INFO @ Tue, 04 Jun 2019 00:16:02: #1 tag size is determined as 50 bps
INFO @ Tue, 04 Jun 2019 00:16:02: #1 tag size = 50
INFO @ Tue, 04 Jun 2019 00:16:02: #1 total tags in treatment: 45663157
INFO @ Tue, 04 Jun 2019 00:16:02: #1 user defined the maximum tags...
INFO @ Tue, 04 Jun 2019 00:16:02: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Tue, 04 Jun 2019 00:16:03: #1 tags after filtering in treatment: 45663157
INFO @ Tue, 04 Jun 2019 00:16:03: #1 Redundant rate of treatment: 0.00
INFO @ Tue, 04 Jun 2019 00:16:03: #1 finished!
INFO @ Tue, 04 Jun 2019 00:16:03: #2 Build Peak Model...
INFO @ Tue, 04 Jun 2019 00:16:03: #2 looking for paired plus/minus strand peaks...
INFO @ Tue, 04 Jun 2019 00:16:06: #2 number of paired peaks: 0
WARNING @ Tue, 04 Jun 2019 00:16:06: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Tue, 04 Jun 2019 00:16:06: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX751571/SRX751571.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
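All three MACS2 runs found 0 paired plus/minus peaks, so the fragment-size model could not be built and no narrowPeak files were produced (hence the `cut`/`rm` errors). MACS2's own warning suggests the workaround: skip model building and use a fixed extension size. A hypothetical manual rerun following that suggestion (the `--nomodel`/`--extsize` flags are from the warning text; the other flags mirror the logged command line; the command is shown but not executed here, since it requires MACS2 on PATH):

```shell
# Hypothetical rerun of the failed q=1e-10 call with model building disabled,
# as suggested by the WARNING above (fixed 147 bp fragment size).
cmd="macs2 callpeak -t SRX751571.bam -f BAM -g dm -n SRX751571.10 -q 1e-10 --nomodel --extsize 147"
echo "$cmd"
# To actually run it:
# $cmd
```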