Job ID = 1294596
Downloading sra file...
Read layout: SINGLE
Converting to fastq...
2019-06-03T00:11:13 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
[the same timeout message was logged 9 more times between 00:12:45 and 00:20:01]
spots read : 20,507,609
reads read : 20,507,609
reads written : 20,507,609
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:06:11
20507609 reads; of these:
  20507609 (100.00%) were unpaired; of these:
    1275594 (6.22%) aligned 0 times
    16370211 (79.83%) aligned exactly 1 time
    2861804 (13.95%) aligned >1 times
93.78% overall alignment rate
Time searching: 00:06:11
Overall time: 00:06:11
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 8 files...
[bam_rmdupse_core] 2448263 / 19232015 = 0.1273 in library ' '
Converted to BAM.
Creating Bed file...
Converting to BedGraph...
INFO @ Mon, 03 Jun 2019 09:33:01:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 09:33:01: #1 read tag files...
INFO @ Mon, 03 Jun 2019 09:33:01: #1 read treatment tags...
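As an aside, the bowtie summary above can be cross-checked arithmetically. A minimal Python sketch, with all numbers copied from the log (the count of aligned reads also matches the 19232015 library size that `bam_rmdupse_core` reports later):

```python
# Sanity check of the bowtie alignment summary (all values taken from the log).
total = 20_507_609       # unpaired reads processed
unaligned = 1_275_594    # aligned 0 times
unique = 16_370_211      # aligned exactly 1 time
multi = 2_861_804        # aligned >1 times

# The three categories partition the input reads.
assert unaligned + unique + multi == total

# Overall alignment rate = reads aligned at least once / total reads.
aligned = unique + multi
print(f"{aligned / total:.2%}")  # → 93.78%, as bowtie reports

# The aligned reads are what samtools rmdup later sees as the library.
assert aligned == 19_232_015
```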
INFO @ Mon, 03 Jun 2019 09:33:01:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 09:33:01: #1 read tag files...
INFO @ Mon, 03 Jun 2019 09:33:01: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 09:33:01:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 09:33:01: #1 read tag files...
INFO @ Mon, 03 Jun 2019 09:33:01: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 09:33:11: 1000000
INFO @ Mon, 03 Jun 2019 09:33:11: 1000000
INFO @ Mon, 03 Jun 2019 09:33:12: 1000000
INFO @ Mon, 03 Jun 2019 09:33:21: 2000000
INFO @ Mon, 03 Jun 2019 09:33:21: 2000000
INFO @ Mon, 03 Jun 2019 09:33:22: 2000000
INFO @ Mon, 03 Jun 2019 09:33:31: 3000000
INFO @ Mon, 03 Jun 2019 09:33:31: 3000000
INFO @ Mon, 03 Jun 2019 09:33:32: 3000000
INFO @ Mon, 03 Jun 2019 09:33:40: 4000000
INFO @ Mon, 03 Jun 2019 09:33:41: 4000000
INFO @ Mon, 03 Jun 2019 09:33:42: 4000000
INFO @ Mon, 03 Jun 2019 09:33:50: 5000000
INFO @ Mon, 03 Jun 2019 09:33:51: 5000000
INFO @ Mon, 03 Jun 2019 09:33:52: 5000000
INFO @ Mon, 03 Jun 2019 09:34:00: 6000000
INFO @ Mon, 03 Jun 2019 09:34:00: 6000000
INFO @ Mon, 03 Jun 2019 09:34:01: 6000000
INFO @ Mon, 03 Jun 2019 09:34:09: 7000000
INFO @ Mon, 03 Jun 2019 09:34:09: 7000000
INFO @ Mon, 03 Jun 2019 09:34:10: 7000000
INFO @ Mon, 03 Jun 2019 09:34:17: 8000000
INFO @ Mon, 03 Jun 2019 09:34:19: 8000000
INFO @ Mon, 03 Jun 2019 09:34:20: 8000000
INFO @ Mon, 03 Jun 2019 09:34:26: 9000000
INFO @ Mon, 03 Jun 2019 09:34:29: 9000000
INFO @ Mon, 03 Jun 2019 09:34:30: 9000000
INFO @ Mon, 03 Jun 2019 09:34:34: 10000000
INFO @ Mon, 03 Jun 2019 09:34:38: 10000000
INFO @ Mon, 03 Jun 2019 09:34:39: 10000000
INFO @ Mon, 03 Jun 2019 09:34:43: 11000000
INFO @ Mon, 03 Jun 2019 09:34:48: 11000000
INFO @ Mon, 03 Jun 2019 09:34:49: 11000000
INFO @ Mon, 03 Jun 2019 09:34:51: 12000000
INFO @ Mon, 03 Jun 2019 09:34:58: 12000000
INFO @ Mon, 03 Jun 2019 09:34:59: 12000000
INFO @ Mon, 03 Jun 2019 09:35:00: 13000000
INFO @ Mon, 03 Jun 2019 09:35:07: 13000000
INFO @ Mon, 03 Jun 2019 09:35:08: 14000000
INFO @ Mon, 03 Jun 2019 09:35:08: 13000000
INFO @ Mon, 03 Jun 2019 09:35:16: 15000000
INFO @ Mon, 03 Jun 2019 09:35:17: 14000000
INFO @ Mon, 03 Jun 2019 09:35:18: 14000000
INFO @ Mon, 03 Jun 2019 09:35:25: 16000000
INFO @ Mon, 03 Jun 2019 09:35:26: 15000000
INFO @ Mon, 03 Jun 2019 09:35:28: 15000000
INFO @ Mon, 03 Jun 2019 09:35:31: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 09:35:31: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 09:35:31: #1 total tags in treatment: 16783752
INFO @ Mon, 03 Jun 2019 09:35:31: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 09:35:31: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 09:35:32: #1 tags after filtering in treatment: 16783752
INFO @ Mon, 03 Jun 2019 09:35:32: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 09:35:32: #1 finished!
INFO @ Mon, 03 Jun 2019 09:35:32: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 09:35:32: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 09:35:33: #2 number of paired peaks: 348
WARNING @ Mon, 03 Jun 2019 09:35:33: Fewer paired peaks (348) than 1000! Model may not be build well! Lower your MFOLD parameter may erase this warning. Now I will use 348 pairs to build model!
INFO @ Mon, 03 Jun 2019 09:35:33: start model_add_line...
INFO @ Mon, 03 Jun 2019 09:35:33: start X-correlation...
INFO @ Mon, 03 Jun 2019 09:35:33: end of X-cor
INFO @ Mon, 03 Jun 2019 09:35:33: #2 finished!
INFO @ Mon, 03 Jun 2019 09:35:33: #2 predicted fragment length is 167 bps
INFO @ Mon, 03 Jun 2019 09:35:33: #2 alternative fragment length(s) may be 2,118,145,167,195,282 bps
INFO @ Mon, 03 Jun 2019 09:35:33: #2.2 Generate R script for model : /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.05_model.r
INFO @ Mon, 03 Jun 2019 09:35:33: #3 Call peaks...
INFO @ Mon, 03 Jun 2019 09:35:33: #3 Pre-compute pvalue-qvalue table...
INFO @ Mon, 03 Jun 2019 09:35:36: 16000000
INFO @ Mon, 03 Jun 2019 09:35:37: 16000000
INFO @ Mon, 03 Jun 2019 09:35:43: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 09:35:43: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 09:35:43: #1 total tags in treatment: 16783752
INFO @ Mon, 03 Jun 2019 09:35:43: #1 user defined the maximum tags...
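The MACS2 tag count above ties out with the earlier samtools rmdup line. A quick Python check, with both values copied from the log:

```python
# Cross-check between [bam_rmdupse_core] and MACS2 "#1 total tags in treatment"
# (values taken from the log).
mapped = 19_232_015     # aligned reads entering rmdup (the rmdupse denominator)
duplicates = 2_448_263  # reads removed as duplicates

# Reads surviving deduplication are exactly what MACS2 reads in as tags.
assert mapped - duplicates == 16_783_752

# Duplicate rate as samtools reports it.
print(f"{duplicates / mapped:.4f}")  # → 0.1273, as in the log
```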
INFO @ Mon, 03 Jun 2019 09:35:43: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 09:35:44: #1 tags after filtering in treatment: 16783752
INFO @ Mon, 03 Jun 2019 09:35:44: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 09:35:44: #1 finished!
INFO @ Mon, 03 Jun 2019 09:35:44: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 09:35:44: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 09:35:45: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 09:35:45: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 09:35:45: #1 total tags in treatment: 16783752
INFO @ Mon, 03 Jun 2019 09:35:45: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 09:35:45: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 09:35:45: #1 tags after filtering in treatment: 16783752
INFO @ Mon, 03 Jun 2019 09:35:45: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 09:35:45: #1 finished!
INFO @ Mon, 03 Jun 2019 09:35:45: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 09:35:45: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 09:35:45: #2 number of paired peaks: 348
WARNING @ Mon, 03 Jun 2019 09:35:45: Fewer paired peaks (348) than 1000! Model may not be build well! Lower your MFOLD parameter may erase this warning. Now I will use 348 pairs to build model!
INFO @ Mon, 03 Jun 2019 09:35:45: start model_add_line...
INFO @ Mon, 03 Jun 2019 09:35:45: start X-correlation...
INFO @ Mon, 03 Jun 2019 09:35:45: end of X-cor
INFO @ Mon, 03 Jun 2019 09:35:45: #2 finished!
INFO @ Mon, 03 Jun 2019 09:35:45: #2 predicted fragment length is 167 bps
INFO @ Mon, 03 Jun 2019 09:35:45: #2 alternative fragment length(s) may be 2,118,145,167,195,282 bps
INFO @ Mon, 03 Jun 2019 09:35:45: #2.2 Generate R script for model : /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.10_model.r
INFO @ Mon, 03 Jun 2019 09:35:45: #3 Call peaks...
INFO @ Mon, 03 Jun 2019 09:35:45: #3 Pre-compute pvalue-qvalue table...
INFO @ Mon, 03 Jun 2019 09:35:47: #2 number of paired peaks: 348
WARNING @ Mon, 03 Jun 2019 09:35:47: Fewer paired peaks (348) than 1000! Model may not be build well! Lower your MFOLD parameter may erase this warning. Now I will use 348 pairs to build model!
INFO @ Mon, 03 Jun 2019 09:35:47: start model_add_line...
INFO @ Mon, 03 Jun 2019 09:35:47: start X-correlation...
INFO @ Mon, 03 Jun 2019 09:35:47: end of X-cor
INFO @ Mon, 03 Jun 2019 09:35:47: #2 finished!
INFO @ Mon, 03 Jun 2019 09:35:47: #2 predicted fragment length is 167 bps
INFO @ Mon, 03 Jun 2019 09:35:47: #2 alternative fragment length(s) may be 2,118,145,167,195,282 bps
INFO @ Mon, 03 Jun 2019 09:35:47: #2.2 Generate R script for model : /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.20_model.r
INFO @ Mon, 03 Jun 2019 09:35:47: #3 Call peaks...
INFO @ Mon, 03 Jun 2019 09:35:47: #3 Pre-compute pvalue-qvalue table...
INFO @ Mon, 03 Jun 2019 09:36:20: #3 Call peaks for each chromosome...
INFO @ Mon, 03 Jun 2019 09:36:33: #3 Call peaks for each chromosome...
INFO @ Mon, 03 Jun 2019 09:36:34: #3 Call peaks for each chromosome...
INFO @ Mon, 03 Jun 2019 09:36:42: #4 Write output xls file... /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.05_peaks.xls
INFO @ Mon, 03 Jun 2019 09:36:42: #4 Write peak in narrowPeak format file... /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.05_peaks.narrowPeak
INFO @ Mon, 03 Jun 2019 09:36:42: #4 Write summits bed file... /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.05_summits.bed
INFO @ Mon, 03 Jun 2019 09:36:42: Done!
pass1 - making usageList (14 chroms): 2 millis
pass2 - checking and writing primary data (2946 records, 4 fields): 8 millis
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 09:36:55: #4 Write output xls file... /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.10_peaks.xls
INFO @ Mon, 03 Jun 2019 09:36:55: #4 Write peak in narrowPeak format file... /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.10_peaks.narrowPeak
INFO @ Mon, 03 Jun 2019 09:36:55: #4 Write summits bed file... /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.10_summits.bed
INFO @ Mon, 03 Jun 2019 09:36:55: Done!
pass1 - making usageList (14 chroms): 1 millis
pass2 - checking and writing primary data (1272 records, 4 fields): 6 millis
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 09:36:57: #4 Write output xls file... /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.20_peaks.xls
INFO @ Mon, 03 Jun 2019 09:36:57: #4 Write peak in narrowPeak format file... /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.20_peaks.narrowPeak
INFO @ Mon, 03 Jun 2019 09:36:57: #4 Write summits bed file... /home/okishinya/chipatlas/results/dm3/SRX287768/SRX287768.20_summits.bed
INFO @ Mon, 03 Jun 2019 09:36:57: Done!
pass1 - making usageList (13 chroms): 1 millis
pass2 - checking and writing primary data (508 records, 4 fields): 2 millis
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
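The three record counts written above (2946, 1272, 508) come from the same BAM called at progressively stricter q-value cutoffs, so the peak set can only shrink. A small Python sketch of that relationship, counts copied from the log:

```python
# narrowPeak record counts per MACS2 q-value cutoff (values from the log).
peaks = {1e-05: 2946, 1e-10: 1272, 1e-20: 508}

# Order from loosest cutoff to strictest; counts must be non-increasing,
# since every peak passing a strict cutoff also passes a looser one.
counts = [peaks[q] for q in sorted(peaks, reverse=True)]
assert counts == sorted(counts, reverse=True)
print(counts)  # → [2946, 1272, 508]
```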