Job ID = 1293629
Downloading the SRA file...
Read layout: SINGLE
Converting to FASTQ...
2019-06-02T16:08:18 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-02T16:08:18 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-02T16:08:18 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
spots read : 50,712,068
reads read : 50,712,068
reads written : 50,712,068
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to FASTQ.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:20:35
50712068 reads; of these:
  50712068 (100.00%) were unpaired; of these:
    2294954 (4.53%) aligned 0 times
    34947851 (68.91%) aligned exactly 1 time
    13469263 (26.56%) aligned >1 times
95.47% overall alignment rate
Time searching: 00:20:35
Overall time: 00:20:35
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 20 files...
[bam_rmdupse_core] 20408812 / 48417114 = 0.4215 in library ' '
Converted to BAM.
Creating BED file...
Converting to BedGraph...
INFO @ Mon, 03 Jun 2019 01:57:34:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 01:57:34: #1 read tag files...
INFO @ Mon, 03 Jun 2019 01:57:34: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 01:57:34:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 01:57:34: #1 read tag files...
INFO @ Mon, 03 Jun 2019 01:57:34: #1 read treatment tags...
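For reference, the upstream steps recorded above (SRA download, FASTQ conversion, single-end mapping, and duplicate removal) correspond roughly to the commands below. This is a minimal sketch, not the job's exact command lines: the run accession, index path, and thread counts are placeholders, and although the status message says bowtie, the alignment summary format above matches bowtie2 output, so bowtie2 is shown here.

    # Sketch of the upstream steps; accession, index path, and thread count are assumed.
    SRR=SRRxxxxxxx                          # hypothetical run accession under SRX1032404
    IDX=/path/to/bowtie2_index/dm3          # hypothetical bowtie2 index prefix for dm3

    fasterq-dump "$SRR" --threads 4 -O .                                       # SRA -> single-end FASTQ
    bowtie2 -p 4 -x "$IDX" -U "$SRR.fastq" -S "$SRR.sam"                       # mapping (summary above matches bowtie2)
    samtools view -bS "$SRR.sam" | samtools sort -o SRX1032404.sorted.bam -    # SAM -> coordinate-sorted BAM
    samtools rmdup -s SRX1032404.sorted.bam SRX1032404.bam                     # drop duplicates ([bam_rmdupse_core] above)

The [samopen], [bam_sort_core], and [bam_rmdupse_core] tags in the log point to an older samtools release; the sketch uses current samtools syntax, and rmdup is deprecated in newer releases in favour of markdup.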
INFO @ Mon, 03 Jun 2019 01:57:34:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 01:57:34: #1 read tag files...
INFO @ Mon, 03 Jun 2019 01:57:34: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 01:57:42: 1000000
INFO @ Mon, 03 Jun 2019 01:57:42: 1000000
INFO @ Mon, 03 Jun 2019 01:57:43: 1000000
INFO @ Mon, 03 Jun 2019 01:57:50: 2000000
INFO @ Mon, 03 Jun 2019 01:57:50: 2000000
INFO @ Mon, 03 Jun 2019 01:57:53: 2000000
INFO @ Mon, 03 Jun 2019 01:57:57: 3000000
INFO @ Mon, 03 Jun 2019 01:57:57: 3000000
INFO @ Mon, 03 Jun 2019 01:58:01: 3000000
INFO @ Mon, 03 Jun 2019 01:58:04: 4000000
INFO @ Mon, 03 Jun 2019 01:58:04: 4000000
INFO @ Mon, 03 Jun 2019 01:58:10: 4000000
INFO @ Mon, 03 Jun 2019 01:58:10: 5000000
INFO @ Mon, 03 Jun 2019 01:58:12: 5000000
INFO @ Mon, 03 Jun 2019 01:58:17: 6000000
INFO @ Mon, 03 Jun 2019 01:58:19: 5000000
INFO @ Mon, 03 Jun 2019 01:58:19: 6000000
INFO @ Mon, 03 Jun 2019 01:58:24: 7000000
INFO @ Mon, 03 Jun 2019 01:58:26: 7000000
INFO @ Mon, 03 Jun 2019 01:58:27: 6000000
INFO @ Mon, 03 Jun 2019 01:58:31: 8000000
INFO @ Mon, 03 Jun 2019 01:58:34: 8000000
INFO @ Mon, 03 Jun 2019 01:58:36: 7000000
INFO @ Mon, 03 Jun 2019 01:58:38: 9000000
INFO @ Mon, 03 Jun 2019 01:58:41: 9000000
INFO @ Mon, 03 Jun 2019 01:58:44: 8000000
INFO @ Mon, 03 Jun 2019 01:58:45: 10000000
INFO @ Mon, 03 Jun 2019 01:58:49: 10000000
INFO @ Mon, 03 Jun 2019 01:58:52: 11000000
INFO @ Mon, 03 Jun 2019 01:58:53: 9000000
INFO @ Mon, 03 Jun 2019 01:58:56: 11000000
INFO @ Mon, 03 Jun 2019 01:58:59: 12000000
INFO @ Mon, 03 Jun 2019 01:59:02: 10000000
INFO @ Mon, 03 Jun 2019 01:59:04: 12000000
INFO @ Mon, 03 Jun 2019 01:59:06: 13000000
INFO @ Mon, 03 Jun 2019 01:59:11: 13000000
INFO @ Mon, 03 Jun 2019 01:59:12: 11000000
INFO @ Mon, 03 Jun 2019 01:59:13: 14000000
INFO @ Mon, 03 Jun 2019 01:59:19: 14000000
INFO @ Mon, 03 Jun 2019 01:59:21: 12000000
INFO @ Mon, 03 Jun 2019 01:59:22: 15000000
INFO @ Mon, 03 Jun 2019 01:59:26: 15000000
INFO @ Mon, 03 Jun 2019 01:59:29: 13000000
INFO @ Mon, 03 Jun 2019 01:59:31: 16000000
INFO @ Mon, 03 Jun 2019 01:59:33: 16000000
INFO @ Mon, 03 Jun 2019 01:59:38: 14000000
INFO @ Mon, 03 Jun 2019 01:59:40: 17000000
INFO @ Mon, 03 Jun 2019 01:59:41: 17000000
INFO @ Mon, 03 Jun 2019 01:59:47: 15000000
INFO @ Mon, 03 Jun 2019 01:59:48: 18000000
INFO @ Mon, 03 Jun 2019 01:59:48: 18000000
INFO @ Mon, 03 Jun 2019 01:59:55: 19000000
INFO @ Mon, 03 Jun 2019 01:59:56: 16000000
INFO @ Mon, 03 Jun 2019 01:59:57: 19000000
INFO @ Mon, 03 Jun 2019 02:00:01: 20000000
INFO @ Mon, 03 Jun 2019 02:00:05: 20000000
INFO @ Mon, 03 Jun 2019 02:00:06: 17000000
INFO @ Mon, 03 Jun 2019 02:00:09: 21000000
INFO @ Mon, 03 Jun 2019 02:00:14: 21000000
INFO @ Mon, 03 Jun 2019 02:00:15: 18000000
INFO @ Mon, 03 Jun 2019 02:00:17: 22000000
INFO @ Mon, 03 Jun 2019 02:00:22: 22000000
INFO @ Mon, 03 Jun 2019 02:00:23: 19000000
INFO @ Mon, 03 Jun 2019 02:00:26: 23000000
INFO @ Mon, 03 Jun 2019 02:00:31: 23000000
INFO @ Mon, 03 Jun 2019 02:00:32: 20000000
INFO @ Mon, 03 Jun 2019 02:00:34: 24000000
INFO @ Mon, 03 Jun 2019 02:00:40: 24000000
INFO @ Mon, 03 Jun 2019 02:00:41: 21000000
INFO @ Mon, 03 Jun 2019 02:00:42: 25000000
INFO @ Mon, 03 Jun 2019 02:00:49: 22000000
INFO @ Mon, 03 Jun 2019 02:00:50: 25000000
INFO @ Mon, 03 Jun 2019 02:00:50: 26000000
INFO @ Mon, 03 Jun 2019 02:00:58: 27000000
INFO @ Mon, 03 Jun 2019 02:00:59: 23000000
INFO @ Mon, 03 Jun 2019 02:01:00: 26000000
INFO @ Mon, 03 Jun 2019 02:01:07: 28000000
INFO @ Mon, 03 Jun 2019 02:01:07: #1 tag size is determined as 51 bps
INFO @ Mon, 03 Jun 2019 02:01:07: #1 tag size = 51
INFO @ Mon, 03 Jun 2019 02:01:07: #1 total tags in treatment: 28008302
INFO @ Mon, 03 Jun 2019 02:01:07: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 02:01:07: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 02:01:08: #1 tags after filtering in treatment: 28008302
INFO @ Mon, 03 Jun 2019 02:01:08: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 02:01:08: #1 finished!
INFO @ Mon, 03 Jun 2019 02:01:08: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 02:01:08: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 02:01:08: 24000000
INFO @ Mon, 03 Jun 2019 02:01:09: 27000000
INFO @ Mon, 03 Jun 2019 02:01:10: #2 number of paired peaks: 178
WARNING @ Mon, 03 Jun 2019 02:01:10: Fewer paired peaks (178) than 1000! Model may not be build well! Lower your MFOLD parameter may erase this warning. Now I will use 178 pairs to build model!
INFO @ Mon, 03 Jun 2019 02:01:10: start model_add_line...
INFO @ Mon, 03 Jun 2019 02:01:10: start X-correlation...
INFO @ Mon, 03 Jun 2019 02:01:10: end of X-cor
INFO @ Mon, 03 Jun 2019 02:01:10: #2 finished!
INFO @ Mon, 03 Jun 2019 02:01:10: #2 predicted fragment length is 130 bps
INFO @ Mon, 03 Jun 2019 02:01:10: #2 alternative fragment length(s) may be 130 bps
INFO @ Mon, 03 Jun 2019 02:01:10: #2.2 Generate R script for model : /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.20_model.r
INFO @ Mon, 03 Jun 2019 02:01:10: #3 Call peaks...
INFO @ Mon, 03 Jun 2019 02:01:10: #3 Pre-compute pvalue-qvalue table...
INFO @ Mon, 03 Jun 2019 02:01:17: 25000000
INFO @ Mon, 03 Jun 2019 02:01:18: 28000000
INFO @ Mon, 03 Jun 2019 02:01:18: #1 tag size is determined as 51 bps
INFO @ Mon, 03 Jun 2019 02:01:18: #1 tag size = 51
INFO @ Mon, 03 Jun 2019 02:01:18: #1 total tags in treatment: 28008302
INFO @ Mon, 03 Jun 2019 02:01:18: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 02:01:18: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 02:01:19: #1 tags after filtering in treatment: 28008302
INFO @ Mon, 03 Jun 2019 02:01:19: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 02:01:19: #1 finished!
INFO @ Mon, 03 Jun 2019 02:01:19: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 02:01:19: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 02:01:21: #2 number of paired peaks: 178
WARNING @ Mon, 03 Jun 2019 02:01:21: Fewer paired peaks (178) than 1000! Model may not be build well! Lower your MFOLD parameter may erase this warning. Now I will use 178 pairs to build model!
INFO @ Mon, 03 Jun 2019 02:01:21: start model_add_line...
INFO @ Mon, 03 Jun 2019 02:01:21: start X-correlation...
INFO @ Mon, 03 Jun 2019 02:01:21: end of X-cor
INFO @ Mon, 03 Jun 2019 02:01:21: #2 finished!
INFO @ Mon, 03 Jun 2019 02:01:21: #2 predicted fragment length is 130 bps
INFO @ Mon, 03 Jun 2019 02:01:21: #2 alternative fragment length(s) may be 130 bps
INFO @ Mon, 03 Jun 2019 02:01:21: #2.2 Generate R script for model : /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.10_model.r
INFO @ Mon, 03 Jun 2019 02:01:21: #3 Call peaks...
INFO @ Mon, 03 Jun 2019 02:01:21: #3 Pre-compute pvalue-qvalue table...
INFO @ Mon, 03 Jun 2019 02:01:26: 26000000
INFO @ Mon, 03 Jun 2019 02:01:35: 27000000
INFO @ Mon, 03 Jun 2019 02:01:44: 28000000
INFO @ Mon, 03 Jun 2019 02:01:44: #1 tag size is determined as 51 bps
INFO @ Mon, 03 Jun 2019 02:01:44: #1 tag size = 51
INFO @ Mon, 03 Jun 2019 02:01:44: #1 total tags in treatment: 28008302
INFO @ Mon, 03 Jun 2019 02:01:44: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 02:01:44: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 02:01:45: #1 tags after filtering in treatment: 28008302
INFO @ Mon, 03 Jun 2019 02:01:45: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 02:01:45: #1 finished!
INFO @ Mon, 03 Jun 2019 02:01:45: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 02:01:45: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 02:01:47: #2 number of paired peaks: 178
WARNING @ Mon, 03 Jun 2019 02:01:47: Fewer paired peaks (178) than 1000! Model may not be build well! Lower your MFOLD parameter may erase this warning. Now I will use 178 pairs to build model!
INFO @ Mon, 03 Jun 2019 02:01:47: start model_add_line...
INFO @ Mon, 03 Jun 2019 02:01:48: start X-correlation...
INFO @ Mon, 03 Jun 2019 02:01:48: end of X-cor
INFO @ Mon, 03 Jun 2019 02:01:48: #2 finished!
INFO @ Mon, 03 Jun 2019 02:01:48: #2 predicted fragment length is 130 bps
INFO @ Mon, 03 Jun 2019 02:01:48: #2 alternative fragment length(s) may be 130 bps
INFO @ Mon, 03 Jun 2019 02:01:48: #2.2 Generate R script for model : /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.05_model.r
INFO @ Mon, 03 Jun 2019 02:01:48: #3 Call peaks...
INFO @ Mon, 03 Jun 2019 02:01:48: #3 Pre-compute pvalue-qvalue table...
INFO @ Mon, 03 Jun 2019 02:02:19: #3 Call peaks for each chromosome...
INFO @ Mon, 03 Jun 2019 02:02:30: #3 Call peaks for each chromosome...
INFO @ Mon, 03 Jun 2019 02:02:50: #4 Write output xls file... /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.20_peaks.xls
INFO @ Mon, 03 Jun 2019 02:02:50: #4 Write peak in narrowPeak format file... /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.20_peaks.narrowPeak
INFO @ Mon, 03 Jun 2019 02:02:50: #4 Write summits bed file... /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.20_summits.bed
INFO @ Mon, 03 Jun 2019 02:02:50: Done!
pass1 - making usageList (14 chroms): 1 millis
pass2 - checking and writing primary data (5179 records, 4 fields): 10 millis
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 02:02:56: #3 Call peaks for each chromosome...
INFO @ Mon, 03 Jun 2019 02:03:02: #4 Write output xls file... /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.10_peaks.xls
INFO @ Mon, 03 Jun 2019 02:03:02: #4 Write peak in narrowPeak format file... /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.10_peaks.narrowPeak
INFO @ Mon, 03 Jun 2019 02:03:02: #4 Write summits bed file... /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.10_summits.bed
INFO @ Mon, 03 Jun 2019 02:03:02: Done!
pass1 - making usageList (14 chroms): 4 millis
pass2 - checking and writing primary data (10678 records, 4 fields): 19 millis
CompletedMACS2peakCalling
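The three MACS2 runs in this log differ only in the q-value cutoff and output prefix, and their interleaved INFO lines indicate they ran concurrently on the same BAM. A loop equivalent to the logged "# Command line:" entries might look like the sketch below; the wrapper script itself is not shown in the log, so backgrounding the jobs is an assumption.

    BAM=/home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.bam
    PREFIX=/home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404

    # One callpeak run per q-value cutoff, mirroring the "# Command line:" entries above.
    for Q in 05 10 20; do
        macs2 callpeak -t "$BAM" -f BAM -g dm -n "$PREFIX.$Q" -q 1e-$Q &
    done
    wait    # the interleaved INFO lines suggest the three runs proceed in parallel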
INFO @ Mon, 03 Jun 2019 02:03:27: #4 Write output xls file... /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.05_peaks.xls
INFO @ Mon, 03 Jun 2019 02:03:28: #4 Write peak in narrowPeak format file... /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.05_peaks.narrowPeak
INFO @ Mon, 03 Jun 2019 02:03:28: #4 Write summits bed file... /home/okishinya/chipatlas/results/dm3/SRX1032404/SRX1032404.05_summits.bed
INFO @ Mon, 03 Jun 2019 02:03:28: Done!
pass1 - making usageList (14 chroms): 8 millis
pass2 - checking and writing primary data (21064 records, 4 fields): 27 millis
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
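The closing steps convert the peak calls and the coverage track to UCSC binary formats: the "pass1/pass2" lines resemble UCSC bedToBigBed output, and the final BedGraph-to-BigWig messages match bedGraphToBigWig. A sketch of equivalent calls follows; the chromosome-sizes file and the intermediate .bed, .bg, and .bw file names are assumptions, not names taken from this log.

    DIR=/home/okishinya/chipatlas/results/dm3/SRX1032404
    CHROMSIZES=dm3.chrom.sizes      # assumed chromosome-sizes file (e.g. fetchChromSizes dm3 > dm3.chrom.sizes)

    # Peak BED -> BigBed per q-value cutoff (the pass1/pass2 lines above resemble bedToBigBed output).
    for Q in 05 10 20; do
        sort -k1,1 -k2,2n "$DIR/SRX1032404.$Q.bed" > "$DIR/SRX1032404.$Q.sorted.bed"    # assumed BED file name
        bedToBigBed "$DIR/SRX1032404.$Q.sorted.bed" "$CHROMSIZES" "$DIR/SRX1032404.$Q.bb"
    done

    # Coverage BedGraph -> BigWig (the final two status messages above).
    bedGraphToBigWig "$DIR/SRX1032404.bg" "$CHROMSIZES" "$DIR/SRX1032404.bw"             # assumed .bg/.bw names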