Job ID = 4178619
Downloading SRA file...
Read layout: SINGLE
Converting to FASTQ...
spots read      : 44,877,392
reads read      : 44,877,392
reads written   : 44,877,392
rm: cannot remove ‘[DSE]RR*’: No such file or directory
Converted to FASTQ.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:01
Multiseed full-index search: 00:16:18
44877392 reads; of these:
  44877392 (100.00%) were unpaired; of these:
    2007243 (4.47%) aligned 0 times
    31254489 (69.64%) aligned exactly 1 time
    11615660 (25.88%) aligned >1 times
95.53% overall alignment rate
Time searching: 00:16:19
Overall time: 00:16:19
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 20 files...
[bam_rmdupse_core] 13532593 / 42870149 = 0.3157 in library ' '
Converted to BAM.
Creating BED file...
INFO @ Thu, 05 Dec 2019 14:06:10:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Thu, 05 Dec 2019 14:06:10: #1 read tag files...
INFO @ Thu, 05 Dec 2019 14:06:10: #1 read treatment tags...
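As a sanity check (illustrative only, not part of the pipeline), the bowtie alignment summary and the samtools rmdup line above are internally consistent; a short Python snippet reproduces the reported percentages:

```python
# Cross-check of the counters reported in the log above.
total = 44_877_392        # unpaired reads fed to bowtie
unaligned = 2_007_243     # aligned 0 times
unique = 31_254_489       # aligned exactly 1 time
multi = 11_615_660        # aligned >1 times

# The three categories partition the input reads.
assert unaligned + unique + multi == total

# Overall alignment rate = fraction of reads aligned at least once.
overall = 100 * (unique + multi) / total
print(f"{overall:.2f}% overall alignment rate")  # 95.53%

# samtools rmdup reports 13532593 duplicates out of 42870149 mapped reads.
dup, mapped = 13_532_593, 42_870_149
assert mapped == unique + multi  # mapped reads = reads aligned >= 1 time
print(f"duplicate fraction = {dup / mapped:.4f}")  # 0.3157
```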
INFO @ Thu, 05 Dec 2019 14:06:16: 1000000
INFO @ Thu, 05 Dec 2019 14:06:22: 2000000
INFO @ Thu, 05 Dec 2019 14:06:28: 3000000
INFO @ Thu, 05 Dec 2019 14:06:35: 4000000
INFO @ Thu, 05 Dec 2019 14:06:39:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Thu, 05 Dec 2019 14:06:39: #1 read tag files...
INFO @ Thu, 05 Dec 2019 14:06:39: #1 read treatment tags...
INFO @ Thu, 05 Dec 2019 14:06:41: 5000000
INFO @ Thu, 05 Dec 2019 14:06:45: 1000000
INFO @ Thu, 05 Dec 2019 14:06:47: 6000000
INFO @ Thu, 05 Dec 2019 14:06:51: 2000000
INFO @ Thu, 05 Dec 2019 14:06:53: 7000000
INFO @ Thu, 05 Dec 2019 14:06:58: 3000000
INFO @ Thu, 05 Dec 2019 14:07:00: 8000000
INFO @ Thu, 05 Dec 2019 14:07:04: 4000000
Converting to BedGraph...
INFO @ Thu, 05 Dec 2019 14:07:07: 9000000
INFO @ Thu, 05 Dec 2019 14:07:11: 5000000
INFO @ Thu, 05 Dec 2019 14:07:11:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Thu, 05 Dec 2019 14:07:11: #1 read tag files...
INFO @ Thu, 05 Dec 2019 14:07:11: #1 read treatment tags...
INFO @ Thu, 05 Dec 2019 14:07:14: 10000000
INFO @ Thu, 05 Dec 2019 14:07:18: 6000000
INFO @ Thu, 05 Dec 2019 14:07:18: 1000000
INFO @ Thu, 05 Dec 2019 14:07:22: 11000000
INFO @ Thu, 05 Dec 2019 14:07:24: 7000000
INFO @ Thu, 05 Dec 2019 14:07:25: 2000000
INFO @ Thu, 05 Dec 2019 14:07:29: 12000000
INFO @ Thu, 05 Dec 2019 14:07:30: 8000000
INFO @ Thu, 05 Dec 2019 14:07:32: 3000000
INFO @ Thu, 05 Dec 2019 14:07:36: 13000000
INFO @ Thu, 05 Dec 2019 14:07:37: 9000000
INFO @ Thu, 05 Dec 2019 14:07:39: 4000000
INFO @ Thu, 05 Dec 2019 14:07:43: 14000000
INFO @ Thu, 05 Dec 2019 14:07:43: 10000000
INFO @ Thu, 05 Dec 2019 14:07:46: 5000000
INFO @ Thu, 05 Dec 2019 14:07:49: 15000000
INFO @ Thu, 05 Dec 2019 14:07:49: 11000000
INFO @ Thu, 05 Dec 2019 14:07:52: 6000000
INFO @ Thu, 05 Dec 2019 14:07:56: 12000000
INFO @ Thu, 05 Dec 2019 14:07:56: 16000000
INFO @ Thu, 05 Dec 2019 14:07:59: 7000000
INFO @ Thu, 05 Dec 2019 14:08:02: 13000000
INFO @ Thu, 05 Dec 2019 14:08:02: 17000000
INFO @ Thu, 05 Dec 2019 14:08:05: 8000000
INFO @ Thu, 05 Dec 2019 14:08:08: 14000000
INFO @ Thu, 05 Dec 2019 14:08:08: 18000000
INFO @ Thu, 05 Dec 2019 14:08:12: 9000000
INFO @ Thu, 05 Dec 2019 14:08:14: 15000000
INFO @ Thu, 05 Dec 2019 14:08:15: 19000000
INFO @ Thu, 05 Dec 2019 14:08:19: 10000000
INFO @ Thu, 05 Dec 2019 14:08:21: 16000000
INFO @ Thu, 05 Dec 2019 14:08:21: 20000000
INFO @ Thu, 05 Dec 2019 14:08:25: 11000000
INFO @ Thu, 05 Dec 2019 14:08:27: 17000000
INFO @ Thu, 05 Dec 2019 14:08:28: 21000000
INFO @ Thu, 05 Dec 2019 14:08:32: 12000000
INFO @ Thu, 05 Dec 2019 14:08:34: 18000000
INFO @ Thu, 05 Dec 2019 14:08:34: 22000000
INFO @ Thu, 05 Dec 2019 14:08:39: 13000000
INFO @ Thu, 05 Dec 2019 14:08:41: 19000000
INFO @ Thu, 05 Dec 2019 14:08:41: 23000000
INFO @ Thu, 05 Dec 2019 14:08:45: 14000000
INFO @ Thu, 05 Dec 2019 14:08:47: 20000000
INFO @ Thu, 05 Dec 2019 14:08:48: 24000000
INFO @ Thu, 05 Dec 2019 14:08:52: 15000000
INFO @ Thu, 05 Dec 2019 14:08:54: 21000000
INFO @ Thu, 05 Dec 2019 14:08:55: 25000000
INFO @ Thu, 05 Dec 2019 14:08:59: 16000000
INFO @ Thu, 05 Dec 2019 14:09:01: 22000000
INFO @ Thu, 05 Dec 2019 14:09:01: 26000000
INFO @ Thu, 05 Dec 2019 14:09:06: 17000000
INFO @ Thu, 05 Dec 2019 14:09:08: 27000000
INFO @ Thu, 05 Dec 2019 14:09:08: 23000000
INFO @ Thu, 05 Dec 2019 14:09:13: 18000000
INFO @ Thu, 05 Dec 2019 14:09:15: 24000000
INFO @ Thu, 05 Dec 2019 14:09:15: 28000000
INFO @ Thu, 05 Dec 2019 14:09:20: 19000000
INFO @ Thu, 05 Dec 2019 14:09:21: 25000000
INFO @ Thu, 05 Dec 2019 14:09:22: 29000000
INFO @ Thu, 05 Dec 2019 14:09:24: #1 tag size is determined as 50 bps
INFO @ Thu, 05 Dec 2019 14:09:24: #1 tag size = 50
INFO @ Thu, 05 Dec 2019 14:09:24: #1 total tags in treatment: 29337556
INFO @ Thu, 05 Dec 2019 14:09:24: #1 user defined the maximum tags...
INFO @ Thu, 05 Dec 2019 14:09:24: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Thu, 05 Dec 2019 14:09:25: #1 tags after filtering in treatment: 29337556
INFO @ Thu, 05 Dec 2019 14:09:25: #1 Redundant rate of treatment: 0.00
INFO @ Thu, 05 Dec 2019 14:09:25: #1 finished!
INFO @ Thu, 05 Dec 2019 14:09:25: #2 Build Peak Model...
INFO @ Thu, 05 Dec 2019 14:09:25: #2 looking for paired plus/minus strand peaks...
INFO @ Thu, 05 Dec 2019 14:09:26: 20000000
INFO @ Thu, 05 Dec 2019 14:09:27: #2 number of paired peaks: 0
WARNING @ Thu, 05 Dec 2019 14:09:27: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Thu, 05 Dec 2019 14:09:27: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Thu, 05 Dec 2019 14:09:27: 26000000
INFO @ Thu, 05 Dec 2019 14:09:32: 21000000
INFO @ Thu, 05 Dec 2019 14:09:33: 27000000
INFO @ Thu, 05 Dec 2019 14:09:39: 22000000
INFO @ Thu, 05 Dec 2019 14:09:40: 28000000
INFO @ Thu, 05 Dec 2019 14:09:45: 23000000
INFO @ Thu, 05 Dec 2019 14:09:46: 29000000
INFO @ Thu, 05 Dec 2019 14:09:48: #1 tag size is determined as 50 bps
INFO @ Thu, 05 Dec 2019 14:09:48: #1 tag size = 50
INFO @ Thu, 05 Dec 2019 14:09:48: #1 total tags in treatment: 29337556
INFO @ Thu, 05 Dec 2019 14:09:48: #1 user defined the maximum tags...
INFO @ Thu, 05 Dec 2019 14:09:48: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Thu, 05 Dec 2019 14:09:49: #1 tags after filtering in treatment: 29337556
INFO @ Thu, 05 Dec 2019 14:09:49: #1 Redundant rate of treatment: 0.00
INFO @ Thu, 05 Dec 2019 14:09:49: #1 finished!
INFO @ Thu, 05 Dec 2019 14:09:49: #2 Build Peak Model...
INFO @ Thu, 05 Dec 2019 14:09:49: #2 looking for paired plus/minus strand peaks...
INFO @ Thu, 05 Dec 2019 14:09:51: #2 number of paired peaks: 0
WARNING @ Thu, 05 Dec 2019 14:09:51: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Thu, 05 Dec 2019 14:09:51: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Thu, 05 Dec 2019 14:09:51: 24000000
INFO @ Thu, 05 Dec 2019 14:09:57: 25000000
INFO @ Thu, 05 Dec 2019 14:10:02: 26000000
INFO @ Thu, 05 Dec 2019 14:10:08: 27000000
INFO @ Thu, 05 Dec 2019 14:10:14: 28000000
INFO @ Thu, 05 Dec 2019 14:10:20: 29000000
INFO @ Thu, 05 Dec 2019 14:10:22: #1 tag size is determined as 50 bps
INFO @ Thu, 05 Dec 2019 14:10:22: #1 tag size = 50
INFO @ Thu, 05 Dec 2019 14:10:22: #1 total tags in treatment: 29337556
INFO @ Thu, 05 Dec 2019 14:10:22: #1 user defined the maximum tags...
INFO @ Thu, 05 Dec 2019 14:10:22: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Thu, 05 Dec 2019 14:10:23: #1 tags after filtering in treatment: 29337556
INFO @ Thu, 05 Dec 2019 14:10:23: #1 Redundant rate of treatment: 0.00
INFO @ Thu, 05 Dec 2019 14:10:23: #1 finished!
INFO @ Thu, 05 Dec 2019 14:10:23: #2 Build Peak Model...
INFO @ Thu, 05 Dec 2019 14:10:23: #2 looking for paired plus/minus strand peaks...
INFO @ Thu, 05 Dec 2019 14:10:24: #2 number of paired peaks: 0
WARNING @ Thu, 05 Dec 2019 14:10:24: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Thu, 05 Dec 2019 14:10:24: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX6468493/SRX6468493.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
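A second consistency check (again illustrative, not part of the pipeline): the 29,337,556 treatment tags that each MACS2 run reports equal the mapped reads left after the samtools rmdup step earlier in the log, and the redundant rate of 0.00 follows because duplicates were already removed before MACS2 saw the BAM. The subsequent cut/rm "No such file or directory" errors are a direct consequence of the zero paired peaks: with no model, MACS2 never wrote the narrowPeak, model.r, or xls files that the cleanup step tries to process.

```python
# Cross-check: MACS2 input size vs. the earlier samtools rmdup line
# "[bam_rmdupse_core] 13532593 / 42870149 = 0.3157".
mapped_reads = 42_870_149
duplicates = 13_532_593
after_rmdup = mapped_reads - duplicates
print(after_rmdup)  # 29337556, matching "#1 total tags in treatment"

# Redundant rate 0.00: rmdup already collapsed per-position duplicates,
# so MACS2's own filter (at most 1 tag per location/strand) removes nothing.
tags_before = tags_after = 29_337_556
print(f"redundant rate = {1 - tags_after / tags_before:.2f}")  # 0.00
```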