Job ID = 1294998
Downloading the SRA file...
Read layout: PAIRED
Converting to fastq...
spots read      : 47,335,037
reads read      : 94,670,074
reads written   : 94,670,074
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 01:06:37
47335037 reads; of these:
  47335037 (100.00%) were paired; of these:
    16360808 (34.56%) aligned concordantly 0 times
    21413934 (45.24%) aligned concordantly exactly 1 time
    9560295 (20.20%) aligned concordantly >1 times
    ----
    16360808 pairs aligned concordantly 0 times; of these:
      267423 (1.63%) aligned discordantly 1 time
    ----
    16093385 pairs aligned 0 times concordantly or discordantly; of these:
      32186770 mates make up the pairs; of these:
        21717202 (67.47%) aligned 0 times
        7145248 (22.20%) aligned exactly 1 time
        3324320 (10.33%) aligned >1 times
77.06% overall alignment rate
Time searching: 01:06:37
Overall time: 01:06:37
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 20 files...
[bam_rmdup_core] processing reference chr2L...
[bam_rmdup_core] processing reference chr2LHet...
[bam_rmdup_core] processing reference chr2R...
[bam_rmdup_core] processing reference chr2RHet...
[bam_rmdup_core] processing reference chr3L...
[bam_rmdup_core] processing reference chr3LHet...
[bam_rmdup_core] processing reference chr3R...
[bam_rmdup_core] processing reference chr3RHet...
[bam_rmdup_core] processing reference chr4...
[bam_rmdup_core] processing reference chrM...
[bam_rmdup_core] processing reference chrU...
[bam_rmdup_core] processing reference chrUextra...
[bam_rmdup_core] processing reference chrX...
[bam_rmdup_core] processing reference chrXHet...
[bam_rmdup_core] processing reference chrYHet...
[bam_rmdup_core] 8367680 / 30989752 = 0.2700 in library ' '
Converted to BAM.
Creating Bed file...
Converting to BedGraph...
INFO @ Mon, 03 Jun 2019 12:53:52: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 12:53:52: #1 read tag files...
INFO @ Mon, 03 Jun 2019 12:53:52: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 12:53:52: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 12:53:52: #1 read tag files...
INFO @ Mon, 03 Jun 2019 12:53:52: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 12:53:52: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 12:53:52: #1 read tag files...
INFO @ Mon, 03 Jun 2019 12:53:52: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 12:53:58: 1000000
INFO @ Mon, 03 Jun 2019 12:53:59: 1000000
INFO @ Mon, 03 Jun 2019 12:53:59: 1000000
INFO @ Mon, 03 Jun 2019 12:54:04: 2000000
INFO @ Mon, 03 Jun 2019 12:54:05: 2000000
INFO @ Mon, 03 Jun 2019 12:54:05: 2000000
INFO @ Mon, 03 Jun 2019 12:54:10: 3000000
INFO @ Mon, 03 Jun 2019 12:54:12: 3000000
INFO @ Mon, 03 Jun 2019 12:54:12: 3000000
INFO @ Mon, 03 Jun 2019 12:54:16: 4000000
INFO @ Mon, 03 Jun 2019 12:54:19: 4000000
INFO @ Mon, 03 Jun 2019 12:54:19: 4000000
INFO @ Mon, 03 Jun 2019 12:54:21: 5000000
INFO @ Mon, 03 Jun 2019 12:54:26: 5000000
INFO @ Mon, 03 Jun 2019 12:54:26: 5000000
INFO @ Mon, 03 Jun 2019 12:54:27: 6000000
INFO @ Mon, 03 Jun 2019 12:54:32: 6000000
INFO @ Mon, 03 Jun 2019 12:54:32: 6000000
INFO @ Mon, 03 Jun 2019 12:54:33: 7000000
INFO @ Mon, 03 Jun 2019 12:54:39: 8000000
INFO @ Mon, 03 Jun 2019 12:54:39: 7000000
INFO @ Mon, 03 Jun 2019 12:54:39: 7000000
INFO @ Mon, 03 Jun 2019 12:54:45: 9000000
INFO @ Mon, 03 Jun 2019 12:54:46: 8000000
INFO @ Mon, 03 Jun 2019 12:54:46: 8000000
INFO @ Mon, 03 Jun 2019 12:54:50: 10000000
INFO @ Mon, 03 Jun 2019 12:54:52: 9000000
INFO @ Mon, 03 Jun 2019 12:54:52: 9000000
INFO @ Mon, 03 Jun 2019 12:54:56: 11000000
INFO @ Mon, 03 Jun 2019 12:54:59: 10000000
INFO @ Mon, 03 Jun 2019 12:54:59: 10000000
INFO @ Mon, 03 Jun 2019 12:55:02: 12000000
INFO @ Mon, 03 Jun 2019 12:55:06: 11000000
INFO @ Mon, 03 Jun 2019 12:55:06: 11000000
INFO @ Mon, 03 Jun 2019 12:55:08: 13000000
INFO @ Mon, 03 Jun 2019 12:55:12: 12000000
INFO @ Mon, 03 Jun 2019 12:55:12: 12000000
INFO @ Mon, 03 Jun 2019 12:55:14: 14000000
INFO @ Mon, 03 Jun 2019 12:55:19: 13000000
INFO @ Mon, 03 Jun 2019 12:55:19: 13000000
INFO @ Mon, 03 Jun 2019 12:55:19: 15000000
INFO @ Mon, 03 Jun 2019 12:55:25: 16000000
INFO @ Mon, 03 Jun 2019 12:55:26: 14000000
INFO @ Mon, 03 Jun 2019 12:55:26: 14000000
INFO @ Mon, 03 Jun 2019 12:55:31: 17000000
INFO @ Mon, 03 Jun 2019 12:55:32: 15000000
INFO @ Mon, 03 Jun 2019 12:55:32: 15000000
INFO @ Mon, 03 Jun 2019 12:55:37: 18000000
INFO @ Mon, 03 Jun 2019 12:55:39: 16000000
INFO @ Mon, 03 Jun 2019 12:55:39: 16000000
INFO @ Mon, 03 Jun 2019 12:55:42: 19000000
INFO @ Mon, 03 Jun 2019 12:55:45: 17000000
INFO @ Mon, 03 Jun 2019 12:55:46: 17000000
INFO @ Mon, 03 Jun 2019 12:55:48: 20000000
INFO @ Mon, 03 Jun 2019 12:55:52: 18000000
INFO @ Mon, 03 Jun 2019 12:55:52: 18000000
INFO @ Mon, 03 Jun 2019 12:55:54: 21000000
INFO @ Mon, 03 Jun 2019 12:55:58: 19000000
INFO @ Mon, 03 Jun 2019 12:55:59: 19000000
INFO @ Mon, 03 Jun 2019 12:56:00: 22000000
INFO @ Mon, 03 Jun 2019 12:56:05: 20000000
INFO @ Mon, 03 Jun 2019 12:56:05: 20000000
INFO @ Mon, 03 Jun 2019 12:56:06: 23000000
INFO @ Mon, 03 Jun 2019 12:56:11: 24000000
INFO @ Mon, 03 Jun 2019 12:56:11: 21000000
INFO @ Mon, 03 Jun 2019 12:56:12: 21000000
INFO @ Mon, 03 Jun 2019 12:56:17: 25000000
INFO @ Mon, 03 Jun 2019 12:56:18: 22000000
INFO @ Mon, 03 Jun 2019 12:56:18: 22000000
INFO @ Mon, 03 Jun 2019 12:56:23: 26000000
INFO @ Mon, 03 Jun 2019 12:56:24: 23000000
INFO @ Mon, 03 Jun 2019 12:56:25: 23000000
INFO @ Mon, 03 Jun 2019 12:56:28: 27000000
INFO @ Mon, 03 Jun 2019 12:56:30: 24000000
INFO @ Mon, 03 Jun 2019 12:56:31: 24000000
INFO @ Mon, 03 Jun 2019 12:56:34: 28000000
INFO @ Mon, 03 Jun 2019 12:56:37: 25000000
INFO @ Mon, 03 Jun 2019 12:56:37: 25000000
INFO @ Mon, 03 Jun 2019 12:56:39: 29000000
INFO @ Mon, 03 Jun 2019 12:56:43: 26000000
INFO @ Mon, 03 Jun 2019 12:56:43: 26000000
INFO @ Mon, 03 Jun 2019 12:56:45: 30000000
INFO @ Mon, 03 Jun 2019 12:56:49: 27000000
INFO @ Mon, 03 Jun 2019 12:56:50: 27000000
INFO @ Mon, 03 Jun 2019 12:56:51: 31000000
INFO @ Mon, 03 Jun 2019 12:56:56: 28000000
INFO @ Mon, 03 Jun 2019 12:56:56: 28000000
INFO @ Mon, 03 Jun 2019 12:56:57: 32000000
INFO @ Mon, 03 Jun 2019 12:57:02: 29000000
INFO @ Mon, 03 Jun 2019 12:57:02: 29000000
INFO @ Mon, 03 Jun 2019 12:57:02: 33000000
INFO @ Mon, 03 Jun 2019 12:57:08: 34000000
INFO @ Mon, 03 Jun 2019 12:57:08: 30000000
INFO @ Mon, 03 Jun 2019 12:57:08: 30000000
INFO @ Mon, 03 Jun 2019 12:57:13: 35000000
INFO @ Mon, 03 Jun 2019 12:57:15: 31000000
INFO @ Mon, 03 Jun 2019 12:57:15: 31000000
INFO @ Mon, 03 Jun 2019 12:57:19: 36000000
INFO @ Mon, 03 Jun 2019 12:57:21: 32000000
INFO @ Mon, 03 Jun 2019 12:57:22: 32000000
INFO @ Mon, 03 Jun 2019 12:57:25: 37000000
INFO @ Mon, 03 Jun 2019 12:57:27: 33000000
INFO @ Mon, 03 Jun 2019 12:57:28: 33000000
INFO @ Mon, 03 Jun 2019 12:57:30: 38000000
INFO @ Mon, 03 Jun 2019 12:57:34: 34000000
INFO @ Mon, 03 Jun 2019 12:57:34: 34000000
INFO @ Mon, 03 Jun 2019 12:57:36: 39000000
INFO @ Mon, 03 Jun 2019 12:57:40: 35000000
INFO @ Mon, 03 Jun 2019 12:57:40: 35000000
INFO @ Mon, 03 Jun 2019 12:57:42: 40000000
INFO @ Mon, 03 Jun 2019 12:57:46: 36000000
INFO @ Mon, 03 Jun 2019 12:57:47: 36000000
INFO @ Mon, 03 Jun 2019 12:57:47: 41000000
INFO @ Mon, 03 Jun 2019 12:57:52: 37000000
INFO @ Mon, 03 Jun 2019 12:57:53: 42000000
INFO @ Mon, 03 Jun 2019 12:57:53: 37000000
INFO @ Mon, 03 Jun 2019 12:57:58: 38000000
INFO @ Mon, 03 Jun 2019 12:57:59: 43000000
INFO @ Mon, 03 Jun 2019 12:58:00: 38000000
INFO @ Mon, 03 Jun 2019 12:58:04: 44000000
INFO @ Mon, 03 Jun 2019 12:58:04: 39000000
INFO @ Mon, 03 Jun 2019 12:58:06: 39000000
INFO @ Mon, 03 Jun 2019 12:58:10: 45000000
INFO @ Mon, 03 Jun 2019 12:58:10: 40000000
INFO @ Mon, 03 Jun 2019 12:58:12: 40000000
INFO @ Mon, 03 Jun 2019 12:58:15: 46000000
INFO @ Mon, 03 Jun 2019 12:58:17: 41000000
INFO @ Mon, 03 Jun 2019 12:58:18: 41000000
INFO @ Mon, 03 Jun 2019 12:58:21: 47000000
INFO @ Mon, 03 Jun 2019 12:58:23: 42000000
INFO @ Mon, 03 Jun 2019 12:58:25: 42000000
INFO @ Mon, 03 Jun 2019 12:58:27: 48000000
INFO @ Mon, 03 Jun 2019 12:58:29: 43000000
INFO @ Mon, 03 Jun 2019 12:58:31: 43000000
INFO @ Mon, 03 Jun 2019 12:58:32: 49000000
INFO @ Mon, 03 Jun 2019 12:58:35: 44000000
INFO @ Mon, 03 Jun 2019 12:58:37: 44000000
INFO @ Mon, 03 Jun 2019 12:58:38: 50000000
INFO @ Mon, 03 Jun 2019 12:58:41: 45000000
INFO @ Mon, 03 Jun 2019 12:58:44: 51000000
INFO @ Mon, 03 Jun 2019 12:58:44: 45000000
INFO @ Mon, 03 Jun 2019 12:58:47: 46000000
INFO @ Mon, 03 Jun 2019 12:58:49: 52000000
INFO @ Mon, 03 Jun 2019 12:58:50: 46000000
INFO @ Mon, 03 Jun 2019 12:58:53: 47000000
INFO @ Mon, 03 Jun 2019 12:58:55: 53000000
INFO @ Mon, 03 Jun 2019 12:58:56: 47000000
INFO @ Mon, 03 Jun 2019 12:59:00: 48000000
INFO @ Mon, 03 Jun 2019 12:59:00: 54000000
INFO @ Mon, 03 Jun 2019 12:59:03: 48000000
INFO @ Mon, 03 Jun 2019 12:59:06: 55000000
INFO @ Mon, 03 Jun 2019 12:59:06: 49000000
INFO @ Mon, 03 Jun 2019 12:59:09: 49000000
INFO @ Mon, 03 Jun 2019 12:59:12: 56000000
INFO @ Mon, 03 Jun 2019 12:59:12: 50000000
INFO @ Mon, 03 Jun 2019 12:59:13: #1 tag size is determined as 24 bps
INFO @ Mon, 03 Jun 2019 12:59:13: #1 tag size = 24
INFO @ Mon, 03 Jun 2019 12:59:13: #1 total tags in treatment: 22614716
INFO @ Mon, 03 Jun 2019 12:59:13: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 12:59:13: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 12:59:14: #1 tags after filtering in treatment: 15567045
INFO @ Mon, 03 Jun 2019 12:59:14: #1 Redundant rate of treatment: 0.31
INFO @ Mon, 03 Jun 2019 12:59:14: #1 finished!
INFO @ Mon, 03 Jun 2019 12:59:14: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 12:59:14: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 12:59:15: #2 number of paired peaks: 218
WARNING @ Mon, 03 Jun 2019 12:59:15: Fewer paired peaks (218) than 1000! Model may not be build well! Lower your MFOLD parameter may erase this warning. Now I will use 218 pairs to build model!
INFO @ Mon, 03 Jun 2019 12:59:15: start model_add_line...
INFO @ Mon, 03 Jun 2019 12:59:15: start X-correlation...
INFO @ Mon, 03 Jun 2019 12:59:15: end of X-cor
INFO @ Mon, 03 Jun 2019 12:59:15: #2 finished!
INFO @ Mon, 03 Jun 2019 12:59:15: #2 predicted fragment length is 73 bps
INFO @ Mon, 03 Jun 2019 12:59:15: #2 alternative fragment length(s) may be 73 bps
INFO @ Mon, 03 Jun 2019 12:59:15: #2.2 Generate R script for model : /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.10_model.r
INFO @ Mon, 03 Jun 2019 12:59:15: #3 Call peaks...
INFO @ Mon, 03 Jun 2019 12:59:15: #3 Pre-compute pvalue-qvalue table...
INFO @ Mon, 03 Jun 2019 12:59:16: 50000000
INFO @ Mon, 03 Jun 2019 12:59:19: 51000000
INFO @ Mon, 03 Jun 2019 12:59:22: 51000000
INFO @ Mon, 03 Jun 2019 12:59:25: 52000000
INFO @ Mon, 03 Jun 2019 12:59:28: 52000000
INFO @ Mon, 03 Jun 2019 12:59:31: 53000000
INFO @ Mon, 03 Jun 2019 12:59:34: 53000000
INFO @ Mon, 03 Jun 2019 12:59:37: 54000000
INFO @ Mon, 03 Jun 2019 12:59:41: 54000000
INFO @ Mon, 03 Jun 2019 12:59:43: 55000000
INFO @ Mon, 03 Jun 2019 12:59:47: 55000000
INFO @ Mon, 03 Jun 2019 12:59:49: 56000000
INFO @ Mon, 03 Jun 2019 12:59:51: #1 tag size is determined as 24 bps
INFO @ Mon, 03 Jun 2019 12:59:51: #1 tag size = 24
INFO @ Mon, 03 Jun 2019 12:59:51: #1 total tags in treatment: 22614716
INFO @ Mon, 03 Jun 2019 12:59:51: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 12:59:51: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 12:59:51: #1 tags after filtering in treatment: 15567045
INFO @ Mon, 03 Jun 2019 12:59:51: #1 Redundant rate of treatment: 0.31
INFO @ Mon, 03 Jun 2019 12:59:51: #1 finished!
INFO @ Mon, 03 Jun 2019 12:59:51: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 12:59:51: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 12:59:53: #2 number of paired peaks: 218
WARNING @ Mon, 03 Jun 2019 12:59:53: Fewer paired peaks (218) than 1000! Model may not be build well! Lower your MFOLD parameter may erase this warning. Now I will use 218 pairs to build model!
INFO @ Mon, 03 Jun 2019 12:59:53: start model_add_line...
INFO @ Mon, 03 Jun 2019 12:59:53: start X-correlation...
INFO @ Mon, 03 Jun 2019 12:59:53: end of X-cor
INFO @ Mon, 03 Jun 2019 12:59:53: #2 finished!
INFO @ Mon, 03 Jun 2019 12:59:53: #2 predicted fragment length is 73 bps
INFO @ Mon, 03 Jun 2019 12:59:53: #2 alternative fragment length(s) may be 73 bps
INFO @ Mon, 03 Jun 2019 12:59:53: #2.2 Generate R script for model : /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.05_model.r
INFO @ Mon, 03 Jun 2019 12:59:53: #3 Call peaks...
INFO @ Mon, 03 Jun 2019 12:59:53: #3 Pre-compute pvalue-qvalue table...
INFO @ Mon, 03 Jun 2019 12:59:53: 56000000
INFO @ Mon, 03 Jun 2019 12:59:54: #1 tag size is determined as 24 bps
INFO @ Mon, 03 Jun 2019 12:59:54: #1 tag size = 24
INFO @ Mon, 03 Jun 2019 12:59:54: #1 total tags in treatment: 22614716
INFO @ Mon, 03 Jun 2019 12:59:54: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 12:59:54: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 12:59:55: #3 Call peaks for each chromosome...
INFO @ Mon, 03 Jun 2019 12:59:55: #1 tags after filtering in treatment: 15567045
INFO @ Mon, 03 Jun 2019 12:59:55: #1 Redundant rate of treatment: 0.31
INFO @ Mon, 03 Jun 2019 12:59:55: #1 finished!
INFO @ Mon, 03 Jun 2019 12:59:55: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 12:59:55: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 12:59:56: #2 number of paired peaks: 218
WARNING @ Mon, 03 Jun 2019 12:59:56: Fewer paired peaks (218) than 1000! Model may not be build well! Lower your MFOLD parameter may erase this warning. Now I will use 218 pairs to build model!
INFO @ Mon, 03 Jun 2019 12:59:56: start model_add_line...
INFO @ Mon, 03 Jun 2019 12:59:56: start X-correlation...
INFO @ Mon, 03 Jun 2019 12:59:56: end of X-cor
INFO @ Mon, 03 Jun 2019 12:59:56: #2 finished!
INFO @ Mon, 03 Jun 2019 12:59:56: #2 predicted fragment length is 73 bps
INFO @ Mon, 03 Jun 2019 12:59:56: #2 alternative fragment length(s) may be 73 bps
INFO @ Mon, 03 Jun 2019 12:59:56: #2.2 Generate R script for model : /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.20_model.r
INFO @ Mon, 03 Jun 2019 12:59:56: #3 Call peaks...
INFO @ Mon, 03 Jun 2019 12:59:56: #3 Pre-compute pvalue-qvalue table...
INFO @ Mon, 03 Jun 2019 13:00:15: #4 Write output xls file... /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.10_peaks.xls
INFO @ Mon, 03 Jun 2019 13:00:15: #4 Write peak in narrowPeak format file... /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.10_peaks.narrowPeak
INFO @ Mon, 03 Jun 2019 13:00:15: #4 Write summits bed file... /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.10_summits.bed
INFO @ Mon, 03 Jun 2019 13:00:15: Done!
pass1 - making usageList (13 chroms): 2 millis
pass2 - checking and writing primary data (1750 records, 4 fields): 8 millis
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 13:00:33: #3 Call peaks for each chromosome...
INFO @ Mon, 03 Jun 2019 13:00:37: #3 Call peaks for each chromosome...
INFO @ Mon, 03 Jun 2019 13:00:53: #4 Write output xls file... /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.05_peaks.xls
INFO @ Mon, 03 Jun 2019 13:00:53: #4 Write peak in narrowPeak format file... /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.05_peaks.narrowPeak
INFO @ Mon, 03 Jun 2019 13:00:53: #4 Write summits bed file... /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.05_summits.bed
INFO @ Mon, 03 Jun 2019 13:00:53: Done!
pass1 - making usageList (15 chroms): 2 millis
pass2 - checking and writing primary data (4063 records, 4 fields): 6 millis
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 13:00:57: #4 Write output xls file... /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.20_peaks.xls
INFO @ Mon, 03 Jun 2019 13:00:57: #4 Write peak in narrowPeak format file... /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.20_peaks.narrowPeak
INFO @ Mon, 03 Jun 2019 13:00:57: #4 Write summits bed file... /home/okishinya/chipatlas/results/dm3/SRX300932/SRX300932.20_summits.bed
INFO @ Mon, 03 Jun 2019 13:00:57: Done!
pass1 - making usageList (13 chroms): 1 millis
pass2 - checking and writing primary data (896 records, 4 fields): 3 millis
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
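For reference, below is a minimal, hypothetical sketch of shell commands that could reproduce the steps recorded in this log (SRA download, fastq conversion, mapping, BAM processing, MACS2 peak calling at three q-value cutoffs, and track generation). Only the macs2 callpeak arguments appear verbatim in the log above; the run accession (SRRXXXXXXX), reference index and chrom.sizes paths, and intermediate file names are placeholders, and the exact tools and options used by the ChIP-Atlas pipeline may differ (for example, the alignment summary above is in bowtie2 format, and the duplicate-removal messages come from the older samtools rmdup).

    # Hypothetical reconstruction; placeholders marked, only the macs2 callpeak
    # arguments are taken verbatim from the log above.

    # 1. Download the SRA run and convert it to paired-end fastq (layout: PAIRED).
    prefetch SRRXXXXXXX                      # placeholder run accession for SRX300932
    fastq-dump --split-files SRRXXXXXXX.sra

    # 2. Map the paired reads to dm3 (the summary above matches bowtie2 output).
    bowtie2 -x dm3 -1 SRRXXXXXXX_1.fastq -2 SRRXXXXXXX_2.fastq -S SRX300932.sam

    # 3. Sort, convert to BAM, and remove duplicates (older samtools syntax, as in the log).
    samtools view -bS SRX300932.sam | samtools sort - SRX300932.sort
    samtools rmdup SRX300932.sort.bam SRX300932.bam

    # 4. Call peaks at the three q-value cutoffs shown in the log (names .05/.10/.20).
    for q in 1e-05 1e-10 1e-20; do
      macs2 callpeak -t SRX300932.bam -f BAM -g dm -n SRX300932.${q#1e-} -q ${q}
    done

    # 5. Build coverage and peak tracks; the "pass1/pass2" messages above are
    # characteristic of UCSC bedToBigBed.
    bedtools genomecov -ibam SRX300932.bam -bg > SRX300932.bg
    bedGraphToBigWig SRX300932.bg dm3.chrom.sizes SRX300932.bw
    sort -k1,1 -k2,2n SRX300932.05_peaks.narrowPeak | cut -f1-4 > SRX300932.05.bed
    bedToBigBed -type=bed4 SRX300932.05.bed dm3.chrom.sizes SRX300932.05.bb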