Job ID = 11632682
Downloading the sra file...
Completed: 877904K bytes transferred in 11 seconds (620363K bits/sec), in 1 file.
The sra file download is complete.
Read layout: SINGLE

Converting to fastq...
Read 58273554 spots for /home/okishinya/chipatlas/results/rn6/SRX3782442/SRR6826260.sra
Written 58273554 spots for /home/okishinya/chipatlas/results/rn6/SRX3782442/SRR6826260.sra
rm: cannot remove `[DSE]RX*': No such file or directory
rm: cannot remove `[DSE]RR*.fastq': No such file or directory
Converted to fastq.

Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:01
Time loading mirror index: 00:00:01
Multiseed full-index search: 00:22:43
58273554 reads; of these:
  58273554 (100.00%) were unpaired; of these:
    3673815 (6.30%) aligned 0 times
    45095929 (77.39%) aligned exactly 1 time
    9503810 (16.31%) aligned >1 times
93.70% overall alignment rate
Time searching: 00:22:46
Overall time: 00:22:46
Mapping completed.

Converting to BAM with samtools...
[samopen] SAM header is present: 953 sequences.
[bam_sort_core] merging from 16 files...
[bam_rmdupse_core] 6587949 / 54599739 = 0.1207 in library ' '
Converted to BAM.

Creating Bed file...
Converting to BedGraph...

INFO @ Fri, 15 Feb 2019 06:12:11:
# Command line: callpeak -t SRX3782442.bam -f BAM -g 2.15e9 -n SRX3782442.05 -q 1e-05
# ARGUMENTS LIST:
# name = SRX3782442.05
# format = BAM
# ChIP-seq file = ['SRX3782442.bam']
# control file = None
# effective genome size = 2.15e+09
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off

INFO @ Fri, 15 Feb 2019 06:12:11: #1 read tag files...
INFO @ Fri, 15 Feb 2019 06:12:11: #1 read treatment tags...

INFO @ Fri, 15 Feb 2019 06:12:11:
# Command line: callpeak -t SRX3782442.bam -f BAM -g 2.15e9 -n SRX3782442.10 -q 1e-10
# ARGUMENTS LIST:
# name = SRX3782442.10
# format = BAM
# ChIP-seq file = ['SRX3782442.bam']
# control file = None
# effective genome size = 2.15e+09
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off

INFO @ Fri, 15 Feb 2019 06:12:11: #1 read tag files...
INFO @ Fri, 15 Feb 2019 06:12:11: #1 read treatment tags...

INFO @ Fri, 15 Feb 2019 06:12:12:
# Command line: callpeak -t SRX3782442.bam -f BAM -g 2.15e9 -n SRX3782442.20 -q 1e-20
# ARGUMENTS LIST:
# name = SRX3782442.20
# format = BAM
# ChIP-seq file = ['SRX3782442.bam']
# control file = None
# effective genome size = 2.15e+09
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off

INFO @ Fri, 15 Feb 2019 06:12:12: #1 read tag files...
INFO @ Fri, 15 Feb 2019 06:12:12: #1 read treatment tags...
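For reference, the steps logged above correspond roughly to the following shell commands. This is a minimal sketch, not the pipeline's actual script: the index name, thread count, and intermediate file names are assumptions, only the macs2 callpeak arguments are taken verbatim from the log, and the aligner summary above is printed in bowtie2's format even though the status message says bowtie.

  # Download the .sra file and convert it to single-end fastq (SRA Toolkit)
  prefetch SRR6826260
  fastq-dump SRR6826260.sra

  # Map reads against an rn6 index (index name and thread count are assumptions)
  bowtie2 -p 4 -x rn6 -U SRR6826260.fastq -S SRX3782442.sam

  # SAM -> sorted BAM, then remove single-end duplicates
  # (samtools 1.x syntax shown; the [samopen]/[bam_sort_core] tags in the log indicate an older samtools)
  samtools view -bS SRX3782442.sam | samtools sort -o SRX3782442.sort.bam -
  samtools rmdup -s SRX3782442.sort.bam SRX3782442.bam

  # BAM -> BED and BedGraph (bedtools; the real pipeline's options may differ)
  bedtools bamtobed -i SRX3782442.bam > SRX3782442.bed
  bedtools genomecov -bg -ibam SRX3782442.bam > SRX3782442.bg

  # Peak calling at the three q-value cutoffs, run in parallel as in the log
  for q in 05 10 20; do
    macs2 callpeak -t SRX3782442.bam -f BAM -g 2.15e9 -n SRX3782442.$q -q 1e-$q &
  done
  wait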
[Fri, 15 Feb 2019 06:12:18 through 06:17:32: per-million tag-reading progress counters from the three parallel callpeak jobs, from 1000000 up to 47000000]
INFO @ Fri, 15 Feb 2019 06:17:38: 48000000
INFO @ Fri, 15 Feb 2019 06:17:38: #1 tag size is determined as 40 bps
INFO @ Fri, 15 Feb 2019 06:17:38: #1 tag size = 40
INFO @ Fri, 15 Feb 2019 06:17:38: #1 total tags in treatment: 48011790
INFO @ Fri, 15 Feb 2019 06:17:38: #1 user defined the maximum tags...
INFO @ Fri, 15 Feb 2019 06:17:38: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Fri, 15 Feb 2019 06:17:39: 47000000
INFO @ Fri, 15 Feb 2019 06:17:39: 47000000
INFO @ Fri, 15 Feb 2019 06:17:40: #1 tags after filtering in treatment: 48011686
INFO @ Fri, 15 Feb 2019 06:17:40: #1 Redundant rate of treatment: 0.00
INFO @ Fri, 15 Feb 2019 06:17:40: #1 finished!
INFO @ Fri, 15 Feb 2019 06:17:40: #2 Build Peak Model...
INFO @ Fri, 15 Feb 2019 06:17:40: #2 looking for paired plus/minus strand peaks...
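Note on the redundant rate reported above: filtering removed 48011790 - 48011686 = 104 tags, and 104 / 48011790 is roughly 2e-06, which MACS2 prints as 0.00. The library was already deduplicated upstream by samtools rmdup (6587949 / 54599739 = 0.1207), which likely explains why almost nothing is left for MACS2 to filter. The figure can be reproduced with, for example:

  awk 'BEGIN{printf "%.6f\n", (48011790 - 48011686) / 48011790}'   # -> 0.000002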
INFO @ Fri, 15 Feb 2019 06:17:45: 48000000
INFO @ Fri, 15 Feb 2019 06:17:46: #1 tag size is determined as 40 bps
INFO @ Fri, 15 Feb 2019 06:17:46: #1 tag size = 40
INFO @ Fri, 15 Feb 2019 06:17:46: #1 total tags in treatment: 48011790
INFO @ Fri, 15 Feb 2019 06:17:46: #1 user defined the maximum tags...
INFO @ Fri, 15 Feb 2019 06:17:46: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Fri, 15 Feb 2019 06:17:46: 48000000
INFO @ Fri, 15 Feb 2019 06:17:46: #1 tag size is determined as 40 bps
INFO @ Fri, 15 Feb 2019 06:17:46: #1 tag size = 40
INFO @ Fri, 15 Feb 2019 06:17:46: #1 total tags in treatment: 48011790
INFO @ Fri, 15 Feb 2019 06:17:46: #1 user defined the maximum tags...
INFO @ Fri, 15 Feb 2019 06:17:46: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Fri, 15 Feb 2019 06:17:47: #1 tags after filtering in treatment: 48011686
INFO @ Fri, 15 Feb 2019 06:17:47: #1 Redundant rate of treatment: 0.00
INFO @ Fri, 15 Feb 2019 06:17:47: #1 finished!
INFO @ Fri, 15 Feb 2019 06:17:47: #2 Build Peak Model...
INFO @ Fri, 15 Feb 2019 06:17:47: #2 looking for paired plus/minus strand peaks...
INFO @ Fri, 15 Feb 2019 06:17:48: #1 tags after filtering in treatment: 48011686
INFO @ Fri, 15 Feb 2019 06:17:48: #1 Redundant rate of treatment: 0.00
INFO @ Fri, 15 Feb 2019 06:17:48: #1 finished!
INFO @ Fri, 15 Feb 2019 06:17:48: #2 Build Peak Model...
INFO @ Fri, 15 Feb 2019 06:17:48: #2 looking for paired plus/minus strand peaks...
INFO @ Fri, 15 Feb 2019 06:17:49: #2 number of paired peaks: 109994
INFO @ Fri, 15 Feb 2019 06:17:49: start model_add_line...
INFO @ Fri, 15 Feb 2019 06:17:50: start X-correlation...
INFO @ Fri, 15 Feb 2019 06:17:50: end of X-cor
INFO @ Fri, 15 Feb 2019 06:17:50: #2 finished!
INFO @ Fri, 15 Feb 2019 06:17:50: #2 predicted fragment length is 274 bps
INFO @ Fri, 15 Feb 2019 06:17:50: #2 alternative fragment length(s) may be 274 bps
INFO @ Fri, 15 Feb 2019 06:17:50: #2.2 Generate R script for model : SRX3782442.05_model.r
INFO @ Fri, 15 Feb 2019 06:17:50: #3 Call peaks...
INFO @ Fri, 15 Feb 2019 06:17:50: #3 Pre-compute pvalue-qvalue table...
INFO @ Fri, 15 Feb 2019 06:17:57: #2 number of paired peaks: 109994
INFO @ Fri, 15 Feb 2019 06:17:57: start model_add_line...
INFO @ Fri, 15 Feb 2019 06:17:57: #2 number of paired peaks: 109994
INFO @ Fri, 15 Feb 2019 06:17:57: start model_add_line...
INFO @ Fri, 15 Feb 2019 06:17:58: start X-correlation...
INFO @ Fri, 15 Feb 2019 06:17:58: end of X-cor
INFO @ Fri, 15 Feb 2019 06:17:58: #2 finished!
INFO @ Fri, 15 Feb 2019 06:17:58: #2 predicted fragment length is 274 bps
INFO @ Fri, 15 Feb 2019 06:17:58: #2 alternative fragment length(s) may be 274 bps
INFO @ Fri, 15 Feb 2019 06:17:58: #2.2 Generate R script for model : SRX3782442.20_model.r
INFO @ Fri, 15 Feb 2019 06:17:58: #3 Call peaks...
INFO @ Fri, 15 Feb 2019 06:17:58: #3 Pre-compute pvalue-qvalue table...
INFO @ Fri, 15 Feb 2019 06:17:58: start X-correlation...
INFO @ Fri, 15 Feb 2019 06:17:58: end of X-cor
INFO @ Fri, 15 Feb 2019 06:17:58: #2 finished!
INFO @ Fri, 15 Feb 2019 06:17:58: #2 predicted fragment length is 274 bps
INFO @ Fri, 15 Feb 2019 06:17:58: #2 alternative fragment length(s) may be 274 bps
INFO @ Fri, 15 Feb 2019 06:17:58: #2.2 Generate R script for model : SRX3782442.10_model.r
INFO @ Fri, 15 Feb 2019 06:17:58: #3 Call peaks...
INFO @ Fri, 15 Feb 2019 06:17:58: #3 Pre-compute pvalue-qvalue table...
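An optional follow-up, assuming R is installed: the model scripts written at step #2.2 (SRX3782442.05_model.r, SRX3782442.10_model.r, SRX3782442.20_model.r) can be rendered into MACS2's model and cross-correlation plots, which visualize the 274-bp fragment length predicted above, for example:

  Rscript SRX3782442.05_model.r   # typically writes SRX3782442.05_model.pdf to the working directory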
INFO @ Fri, 15 Feb 2019 06:20:10: #3 Call peaks for each chromosome...
INFO @ Fri, 15 Feb 2019 06:20:13: #3 Call peaks for each chromosome...
INFO @ Fri, 15 Feb 2019 06:20:14: #3 Call peaks for each chromosome...
INFO @ Fri, 15 Feb 2019 06:21:31: #4 Write output xls file... SRX3782442.05_peaks.xls
INFO @ Fri, 15 Feb 2019 06:21:32: #4 Write peak in narrowPeak format file... SRX3782442.05_peaks.narrowPeak
INFO @ Fri, 15 Feb 2019 06:21:32: #4 Write summits bed file... SRX3782442.05_summits.bed
INFO @ Fri, 15 Feb 2019 06:21:33: Done!
pass1 - making usageList (144 chroms): 17 millis
pass2 - checking and writing primary data (53019 records, 4 fields): 72 millis
CompletedMACS2peakCalling
INFO @ Fri, 15 Feb 2019 06:21:36: #4 Write output xls file... SRX3782442.10_peaks.xls
INFO @ Fri, 15 Feb 2019 06:21:36: #4 Write peak in narrowPeak format file... SRX3782442.10_peaks.narrowPeak
INFO @ Fri, 15 Feb 2019 06:21:36: #4 Write summits bed file... SRX3782442.10_summits.bed
INFO @ Fri, 15 Feb 2019 06:21:36: Done!
pass1 - making usageList (123 chroms): 16 millis
pass2 - checking and writing primary data (39003 records, 4 fields): 67 millis
CompletedMACS2peakCalling
INFO @ Fri, 15 Feb 2019 06:21:37: #4 Write output xls file... SRX3782442.20_peaks.xls
INFO @ Fri, 15 Feb 2019 06:21:38: #4 Write peak in narrowPeak format file... SRX3782442.20_peaks.narrowPeak
INFO @ Fri, 15 Feb 2019 06:21:38: #4 Write summits bed file... SRX3782442.20_summits.bed
INFO @ Fri, 15 Feb 2019 06:21:38: Done!
pass1 - making usageList (91 chroms): 12 millis
pass2 - checking and writing primary data (23440 records, 4 fields): 38 millis
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
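The final conversions are typically done with the UCSC utilities; the pass1/pass2 lines above are the output format of bedToBigBed, run once per q-value threshold on the peak files. A minimal sketch, assuming a 4-column peak BED per cutoff and a chrom-sizes file (both names are assumptions, not taken from the log), with the BedGraph sorted by chromosome and position:

  # coverage track: BedGraph -> BigWig
  bedGraphToBigWig SRX3782442.bg rn6.chrom.sizes SRX3782442.bw

  # peak tracks: 4-column BED -> BigBed, one file per q-value cutoff
  for q in 05 10 20; do
    bedToBigBed SRX3782442.$q.bed rn6.chrom.sizes SRX3782442.$q.bb
  done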