Job ID = 1299219
Downloading the SRA file...
Read layout: PAIRED
Converting to fastq...
spots read      : 20,399,687
reads read      : 40,799,374
reads written   : 40,799,374
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 01:08:42
20399687 reads; of these:
  20399687 (100.00%) were paired; of these:
    786807 (3.86%) aligned concordantly 0 times
    15077826 (73.91%) aligned concordantly exactly 1 time
    4535054 (22.23%) aligned concordantly >1 times
    ----
    786807 pairs aligned concordantly 0 times; of these:
      83861 (10.66%) aligned discordantly 1 time
    ----
    702946 pairs aligned 0 times concordantly or discordantly; of these:
      1405892 mates make up the pairs; of these:
        920504 (65.47%) aligned 0 times
        356420 (25.35%) aligned exactly 1 time
        128968 (9.17%) aligned >1 times
97.74% overall alignment rate
Time searching: 01:08:42
Overall time: 01:08:42
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 20 files...
[bam_rmdup_core] processing reference chr2L...
[bam_rmdup_core] processing reference chr2LHet...
[bam_rmdup_core] processing reference chr2R...
[bam_rmdup_core] processing reference chr2RHet...
[bam_rmdup_core] processing reference chr3L...
[bam_rmdup_core] processing reference chr3LHet...
[bam_rmdup_core] processing reference chr3R...
[bam_rmdup_core] processing reference chr3RHet...
[bam_rmdup_core] processing reference chr4...
[bam_rmdup_core] processing reference chrM...
[bam_rmdup_core] processing reference chrU...
[bam_rmdup_core] processing reference chrUextra...
[bam_rmdup_core] processing reference chrX...
[bam_rmdup_core] processing reference chrXHet...
[bam_rmdup_core] processing reference chrYHet...
[bam_rmdup_core] 1226184 / 19669603 = 0.0623 in library ' '
Converted to BAM.
Creating Bed file...
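As a sanity check, bowtie2's reported 97.74% overall alignment rate can be reproduced from the mate-level counts in the summary above (40,799,374 mates in total, of which 920,504 aligned 0 times):

```shell
# Recompute the overall alignment rate from the bowtie2 summary above:
# aligned mates / total mates, where total = 2 x 20399687 paired reads.
awk 'BEGIN {
  total     = 40799374   # mates written by fastq-dump
  unaligned = 920504     # mates that aligned 0 times
  printf "%.2f%% overall alignment rate\n", 100 * (total - unaligned) / total
}'
# prints: 97.74% overall alignment rate
```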
Converting to BedGraph...
INFO @ Mon, 03 Jun 2019 19:48:44: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 19:48:44: #1 read tag files...
INFO @ Mon, 03 Jun 2019 19:48:44: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 19:48:44: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 19:48:44: #1 read tag files...
INFO @ Mon, 03 Jun 2019 19:48:44: #1 read treatment tags...
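The callpeak invocations logged here differ only in the q-value cutoff (1e-05, 1e-10, 1e-20). A minimal sketch of how such a sweep could be driven as a loop; this is a dry run via echo (only the BAM path is taken from the log, the loop itself is an assumption):

```shell
# Dry-run sketch of the three MACS2 runs, one per q-value cutoff.
# Replace "echo macs2" with the real macs2 binary to actually execute.
bam=/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.bam
for q in 05 10 20; do
  echo macs2 callpeak -t "$bam" -f BAM -g dm \
       -n "${bam%.bam}.$q" -q "1e-$q"
done
```

The `-n` prefix `${bam%.bam}.$q` reproduces the `SRX474568.05` / `.10` / `.20` output names seen in the log.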
INFO @ Mon, 03 Jun 2019 19:48:44: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 19:48:44: #1 read tag files...
INFO @ Mon, 03 Jun 2019 19:48:44: #1 read treatment tags...
[INFO progress ticks, 19:48:59–19:51:11: the three interleaved callpeak runs counted treatment tags from 1000000 up to 10000000; elided]
[INFO progress ticks, 19:51:26–19:55:04: the three interleaved callpeak runs counted treatment tags from 10000000 up to 27000000; elided]
[INFO progress ticks, 19:55:04–19:57:23: the three interleaved callpeak runs counted treatment tags from 27000000 up to 37000000; elided]
INFO @ Mon, 03 Jun 2019 19:57:29: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 19:57:29: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 19:57:29: #1 total tags in treatment: 18388054
INFO @ Mon, 03 Jun 2019 19:57:29: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 19:57:29: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 19:57:29: 34000000
INFO @ Mon, 03 Jun 2019 19:57:29: #1 tags after filtering in treatment: 15833572
INFO @ Mon, 03 Jun 2019 19:57:29: #1 Redundant rate of treatment: 0.14
INFO @ Mon, 03 Jun 2019 19:57:29: #1 finished!
INFO @ Mon, 03 Jun 2019 19:57:29: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 19:57:29: #2 looking for paired plus/minus strand peaks...
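The 0.14 redundant rate reported by MACS2 follows directly from the two tag counts in the log (18,388,054 tags before filtering, 15,833,572 after):

```shell
# Redundant rate = fraction of tags removed by the
# at-most-1-tag-per-position-and-strand filter.
awk 'BEGIN {
  before = 18388054   # total tags in treatment
  after  = 15833572   # tags after filtering
  printf "Redundant rate: %.2f\n", (before - after) / before
}'
# prints: Redundant rate: 0.14
```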
INFO @ Mon, 03 Jun 2019 19:57:31: #2 number of paired peaks: 14
WARNING @ Mon, 03 Jun 2019 19:57:31: Too few paired peaks (14) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 19:57:31: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 19:57:32: 34000000
INFO @ Mon, 03 Jun 2019 19:57:43: 35000000
INFO @ Mon, 03 Jun 2019 19:57:46: 35000000
INFO @ Mon, 03 Jun 2019 19:57:58: 36000000
INFO @ Mon, 03 Jun 2019 19:58:01: 36000000
INFO @ Mon, 03 Jun 2019 19:58:13: 37000000
INFO @ Mon, 03 Jun 2019 19:58:16: 37000000
INFO @ Mon, 03 Jun 2019 19:58:20: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 19:58:20: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 19:58:20: #1 total tags in treatment: 18388054
INFO @ Mon, 03 Jun 2019 19:58:20: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 19:58:20: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 19:58:20: #1 tags after filtering in treatment: 15833572
INFO @ Mon, 03 Jun 2019 19:58:20: #1 Redundant rate of treatment: 0.14
INFO @ Mon, 03 Jun 2019 19:58:20: #1 finished!
INFO @ Mon, 03 Jun 2019 19:58:20: #2 Build Peak Model...
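MACS2 found only 14 paired peaks, so model building fails in each run and no narrowPeak files are written (hence the cut/rm errors above). Following the hint in the warning itself, peak calling could be retried with a fixed fragment extension instead of the shifting model. A hedged sketch (dry run via echo; the BAM path comes from the log, 147 bp is the value the warning suggests):

```shell
# Dry-run sketch of retrying the q=1e-10 run without the shifting model,
# using the fixed --extsize suggested by the MACS2 warning above.
bam=/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.bam
echo macs2 callpeak -t "$bam" -f BAM -g dm \
     -n "${bam%.bam}.10" -q 1e-10 --nomodel --extsize 147
```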
INFO @ Mon, 03 Jun 2019 19:58:20: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 19:58:22: #2 number of paired peaks: 14
WARNING @ Mon, 03 Jun 2019 19:58:22: Too few paired peaks (14) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 19:58:22: Process for pairing-model is terminated!
INFO @ Mon, 03 Jun 2019 19:58:22: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 19:58:22: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 19:58:22: #1 total tags in treatment: 18388054
INFO @ Mon, 03 Jun 2019 19:58:22: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 19:58:22: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
cut: /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 8 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 19:58:23: #1 tags after filtering in treatment: 15833572
INFO @ Mon, 03 Jun 2019 19:58:23: #1 Redundant rate of treatment: 0.14
INFO @ Mon, 03 Jun 2019 19:58:23: #1 finished!
INFO @ Mon, 03 Jun 2019 19:58:23: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 19:58:23: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 19:58:24: #2 number of paired peaks: 14
WARNING @ Mon, 03 Jun 2019 19:58:24: Too few paired peaks (14) so I can not build the model!
Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 19:58:24: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
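The final bedGraph-to-bigWig step is typically done with UCSC's bedGraphToBigWig, which takes the bedGraph, a chromosome-sizes file, and the output name. A hedged sketch (dry run via echo; only the SRX474568 path prefix comes from the log, while the `.bg` and `dm3.chrom.sizes` filenames are assumptions):

```shell
# Dry-run sketch of the bedGraph -> bigWig conversion.
# dm3.chrom.sizes and the .bg filename are hypothetical.
prefix=/home/okishinya/chipatlas/results/dm3/SRX474568/SRX474568
echo bedGraphToBigWig "$prefix.bg" dm3.chrom.sizes "$prefix.bw"
```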