Job ID = 1298842
Downloading the SRA file...
Read layout: SINGLE
Converting to fastq...
2019-06-03T08:35:21 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
1900-01-00T00:00:00 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:37:17 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:38:58 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:38:58 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:40:02 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:41:21 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:41:52 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:41:52 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:41:52 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:42:37 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:48:21 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:48:21 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T08:50:09 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
spots read    : 37,021,579
reads read    : 37,021,579
reads written : 37,021,579
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:14:07
37021579 reads; of these:
  37021579 (100.00%) were unpaired; of these:
    1597982 (4.32%) aligned 0 times
    26316918 (71.09%) aligned exactly 1 time
    9106679 (24.60%) aligned >1 times
95.68% overall alignment rate
Time searching: 00:14:07
Overall time: 00:14:07
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 16 files...
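(Note) The steps above — SRA download, fastq conversion with fasterq-dump 2.9.6, single-end mapping, and BAM conversion/sorting with samtools — could be reproduced roughly as sketched below. The exact flags, index name, and run accession are not printed in this log, so treat every option here as an assumption; the alignment summary format resembles bowtie2 output even though the status message says "bowtie", and modern samtools syntax is used here although the [samopen]/[bam_sort_core] messages come from the legacy 0.1.x series.

# Hypothetical sketch of the download / convert / map / BAM steps seen above.
set -euo pipefail
srr=SRRXXXXXXX                             # run accession is not shown in the log (placeholder)
srx=SRX467054                              # experiment ID used later in the log
prefetch "$srr"                            # network timeouts like those above are retried by the toolkit
fasterq-dump "$srr" -O .                   # Read layout: SINGLE -> one fastq file
bowtie2 -x dm3 -U "${srr}.fastq" -p 4 -S "${srx}.sam"    # assumed bowtie2-style invocation against a dm3 index
samtools view -bS "${srx}.sam" > "${srx}.unsorted.bam"   # SAM -> BAM
samtools sort -o "${srx}.sorted.bam" "${srx}.unsorted.bam"
samtools rmdup -s "${srx}.sorted.bam" "${srx}.bam"       # single-end duplicate removal ([bam_rmdupse_core] below)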
[bam_rmdupse_core] 9397041 / 35423597 = 0.2653 in library ' '
Converted to BAM.
Creating Bed file...
Converting to BedGraph...
INFO @ Mon, 03 Jun 2019 18:20:18: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 18:20:18: #1 read tag files...
INFO @ Mon, 03 Jun 2019 18:20:18: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 18:20:18: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 18:20:18: #1 read tag files...
INFO @ Mon, 03 Jun 2019 18:20:18: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 18:20:18: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 18:20:18: #1 read tag files...
INFO @ Mon, 03 Jun 2019 18:20:18: #1 read treatment tags...
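(Note) The three callpeak command lines above are recorded verbatim in the log; collected together they amount to the following. Running the three q-value thresholds in parallel is an assumption inferred from the interleaved progress lines that follow.

# MACS2 peak calling at q-value cutoffs 1e-05, 1e-10 and 1e-20, as printed in the log.
base=/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054
for q in 05 10 20; do
  macs2 callpeak -t "${base}.bam" -f BAM -g dm -n "${base}.${q}" -q "1e-${q}" &   # parallel execution is assumed
done
wait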
INFO @ Mon, 03 Jun 2019 18:20:25: 1000000
INFO @ Mon, 03 Jun 2019 18:20:26: 1000000
INFO @ Mon, 03 Jun 2019 18:20:28: 1000000
INFO @ Mon, 03 Jun 2019 18:20:32: 2000000
INFO @ Mon, 03 Jun 2019 18:20:34: 2000000
INFO @ Mon, 03 Jun 2019 18:20:37: 2000000
INFO @ Mon, 03 Jun 2019 18:20:39: 3000000
INFO @ Mon, 03 Jun 2019 18:20:41: 3000000
INFO @ Mon, 03 Jun 2019 18:20:46: 3000000
INFO @ Mon, 03 Jun 2019 18:20:46: 4000000
INFO @ Mon, 03 Jun 2019 18:20:48: 4000000
INFO @ Mon, 03 Jun 2019 18:20:53: 5000000
INFO @ Mon, 03 Jun 2019 18:20:54: 4000000
INFO @ Mon, 03 Jun 2019 18:20:55: 5000000
INFO @ Mon, 03 Jun 2019 18:21:00: 6000000
INFO @ Mon, 03 Jun 2019 18:21:02: 6000000
INFO @ Mon, 03 Jun 2019 18:21:03: 5000000
INFO @ Mon, 03 Jun 2019 18:21:07: 7000000
INFO @ Mon, 03 Jun 2019 18:21:09: 7000000
INFO @ Mon, 03 Jun 2019 18:21:12: 6000000
INFO @ Mon, 03 Jun 2019 18:21:14: 8000000
INFO @ Mon, 03 Jun 2019 18:21:16: 8000000
INFO @ Mon, 03 Jun 2019 18:21:21: 7000000
INFO @ Mon, 03 Jun 2019 18:21:21: 9000000
INFO @ Mon, 03 Jun 2019 18:21:23: 9000000
INFO @ Mon, 03 Jun 2019 18:21:28: 10000000
INFO @ Mon, 03 Jun 2019 18:21:30: 8000000
INFO @ Mon, 03 Jun 2019 18:21:30: 10000000
INFO @ Mon, 03 Jun 2019 18:21:35: 11000000
INFO @ Mon, 03 Jun 2019 18:21:37: 11000000
INFO @ Mon, 03 Jun 2019 18:21:38: 9000000
INFO @ Mon, 03 Jun 2019 18:21:42: 12000000
INFO @ Mon, 03 Jun 2019 18:21:44: 12000000
INFO @ Mon, 03 Jun 2019 18:21:47: 10000000
INFO @ Mon, 03 Jun 2019 18:21:49: 13000000
INFO @ Mon, 03 Jun 2019 18:21:51: 13000000
INFO @ Mon, 03 Jun 2019 18:21:56: 14000000
INFO @ Mon, 03 Jun 2019 18:21:56: 11000000
INFO @ Mon, 03 Jun 2019 18:21:58: 14000000
INFO @ Mon, 03 Jun 2019 18:22:03: 15000000
INFO @ Mon, 03 Jun 2019 18:22:05: 15000000
INFO @ Mon, 03 Jun 2019 18:22:05: 12000000
INFO @ Mon, 03 Jun 2019 18:22:10: 16000000
INFO @ Mon, 03 Jun 2019 18:22:12: 16000000
INFO @ Mon, 03 Jun 2019 18:22:14: 13000000
INFO @ Mon, 03 Jun 2019 18:22:17: 17000000
INFO @ Mon, 03 Jun 2019 18:22:19: 17000000
INFO @ Mon, 03 Jun 2019 18:22:22: 14000000
INFO @ Mon, 03 Jun 2019 18:22:24: 18000000
INFO @ Mon, 03 Jun 2019 18:22:26: 18000000
INFO @ Mon, 03 Jun 2019 18:22:31: 19000000
INFO @ Mon, 03 Jun 2019 18:22:31: 15000000
INFO @ Mon, 03 Jun 2019 18:22:33: 19000000
INFO @ Mon, 03 Jun 2019 18:22:38: 20000000
INFO @ Mon, 03 Jun 2019 18:22:40: 20000000
INFO @ Mon, 03 Jun 2019 18:22:40: 16000000
INFO @ Mon, 03 Jun 2019 18:22:45: 21000000
INFO @ Mon, 03 Jun 2019 18:22:47: 21000000
INFO @ Mon, 03 Jun 2019 18:22:49: 17000000
INFO @ Mon, 03 Jun 2019 18:22:52: 22000000
INFO @ Mon, 03 Jun 2019 18:22:54: 22000000
INFO @ Mon, 03 Jun 2019 18:22:57: 18000000
INFO @ Mon, 03 Jun 2019 18:22:59: 23000000
INFO @ Mon, 03 Jun 2019 18:23:01: 23000000
INFO @ Mon, 03 Jun 2019 18:23:06: 24000000
INFO @ Mon, 03 Jun 2019 18:23:06: 19000000
INFO @ Mon, 03 Jun 2019 18:23:08: 24000000
INFO @ Mon, 03 Jun 2019 18:23:13: 25000000
INFO @ Mon, 03 Jun 2019 18:23:15: 25000000
INFO @ Mon, 03 Jun 2019 18:23:15: 20000000
INFO @ Mon, 03 Jun 2019 18:23:20: 26000000
INFO @ Mon, 03 Jun 2019 18:23:20: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 18:23:20: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 18:23:20: #1 total tags in treatment: 26026556
INFO @ Mon, 03 Jun 2019 18:23:20: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 18:23:20: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 18:23:21: #1 tags after filtering in treatment: 26026556
INFO @ Mon, 03 Jun 2019 18:23:21: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 18:23:21: #1 finished!
INFO @ Mon, 03 Jun 2019 18:23:21: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 18:23:21: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 18:23:22: 26000000
INFO @ Mon, 03 Jun 2019 18:23:22: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 18:23:22: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 18:23:22: #1 total tags in treatment: 26026556
INFO @ Mon, 03 Jun 2019 18:23:22: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 18:23:22: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 18:23:23: #1 tags after filtering in treatment: 26026556
INFO @ Mon, 03 Jun 2019 18:23:23: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 18:23:23: #1 finished!
INFO @ Mon, 03 Jun 2019 18:23:23: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 18:23:23: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 18:23:23: #2 number of paired peaks: 22
WARNING @ Mon, 03 Jun 2019 18:23:23: Too few paired peaks (22) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 18:23:23: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 18:23:24: 21000000
INFO @ Mon, 03 Jun 2019 18:23:25: #2 number of paired peaks: 22
WARNING @ Mon, 03 Jun 2019 18:23:25: Too few paired peaks (22) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 18:23:25: Process for pairing-model is terminated!
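(Note) MACS2 found only 22 paired peaks, so it could not build the shifting model; no narrowPeak file was produced, which is why the subsequent cut/rm steps report missing files (the "pass1 - making usageList" / "needLargeMem" messages appear to come from a UCSC utility such as bedToBigBed being fed the empty output, which is an inference, not something stated in the log). The warning itself suggests the usual workaround; whether the pipeline actually retries this way is not shown here.

# Fallback suggested by the MACS2 warning above: skip model building and use a fixed fragment size.
# Illustrative only; this rerun is not part of the recorded log.
base=/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054
macs2 callpeak -t "${base}.bam" -f BAM -g dm -n "${base}.05" -q 1e-05 --nomodel --extsize 147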
cut: /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 18:23:33: 22000000
INFO @ Mon, 03 Jun 2019 18:23:41: 23000000
INFO @ Mon, 03 Jun 2019 18:23:50: 24000000
INFO @ Mon, 03 Jun 2019 18:23:58: 25000000
INFO @ Mon, 03 Jun 2019 18:24:07: 26000000
INFO @ Mon, 03 Jun 2019 18:24:07: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 18:24:07: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 18:24:07: #1 total tags in treatment: 26026556
INFO @ Mon, 03 Jun 2019 18:24:07: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 18:24:07: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 18:24:08: #1 tags after filtering in treatment: 26026556
INFO @ Mon, 03 Jun 2019 18:24:08: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 18:24:08: #1 finished!
INFO @ Mon, 03 Jun 2019 18:24:08: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 18:24:08: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 18:24:10: #2 number of paired peaks: 22
WARNING @ Mon, 03 Jun 2019 18:24:10: Too few paired peaks (22) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 18:24:10: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
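(Note) The final "Converting to BigWig..." step is not recorded as a command line. A minimal sketch, assuming the UCSC bedGraphToBigWig utility and a dm3 chromosome-sizes file; both file names are hypothetical.

# bedGraph -> bigWig conversion for the coverage track; tool choice and file names are assumptions.
base=/home/okishinya/chipatlas/results/dm3/SRX467054/SRX467054
bedGraphToBigWig "${base}.bedGraph" dm3.chrom.sizes "${base}.bw"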