Job ID = 1293775
Downloading the SRA file...
Read layout: SINGLE
Converting to fastq...
2019-06-02T17:17:57 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-02T17:18:17 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-02T17:20:58 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-02T17:20:58 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
spots read : 19,804,394
reads read : 19,804,394
reads written : 19,804,394
spots read : 17,014,202
reads read : 17,014,202
reads written : 17,014,202
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:13:59
36818596 reads; of these:
  36818596 (100.00%) were unpaired; of these:
    513638 (1.40%) aligned 0 times
    26286560 (71.39%) aligned exactly 1 time
    10018398 (27.21%) aligned >1 times
98.60% overall alignment rate
Time searching: 00:13:59
Overall time: 00:13:59
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 12 files...
[bam_rmdupse_core] 12607797 / 36304958 = 0.3473 in library ' '
Converted to BAM.
Creating Bed file...
Converting to BedGraph...
INFO @ Mon, 03 Jun 2019 02:59:24: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 02:59:24: #1 read tag files...
INFO @ Mon, 03 Jun 2019 02:59:24: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 02:59:24: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 02:59:24: #1 read tag files...
INFO @ Mon, 03 Jun 2019 02:59:24: #1 read treatment tags...
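The log above covers the whole job for SRX113319: downloading the SRA data (two runs, matching the two spots-read blocks), converting them to fastq, mapping the single-end reads, sorting and deduplicating the BAM, building BED/BedGraph tracks, and launching MACS2 peak calling at three q-value cutoffs (the run with -q 1e-10 starts immediately below). As a rough sketch only, shell commands producing output of this shape might look as follows; the SRR accessions, index path, thread counts, and file names are placeholders, and the exact tools behind the BED/BedGraph steps are not named in the log.

# Sketch only: accessions, index path, and thread counts are placeholders, not taken from this log.
for run in SRR_RUN1 SRR_RUN2; do             # the two runs under SRX113319 (accessions not shown above)
  prefetch "$run"
  fasterq-dump -e 4 -O . "$run"              # single-end layout, so one fastq per run
done

bowtie2 -p 4 -x dm3 -U SRR_RUN1.fastq,SRR_RUN2.fastq -S SRX113319.sam   # the alignment summary above matches bowtie2's format

samtools view -bS SRX113319.sam | samtools sort -o SRX113319.sort.bam -
samtools rmdup -s SRX113319.sort.bam SRX113319.bam    # single-end duplicate removal (cf. bam_rmdupse_core above); newer samtools would use markdup

bedtools bamtobed -i SRX113319.bam > SRX113319.bed    # BED/BedGraph tools are assumptions; the log does not name them
bedtools genomecov -ibam SRX113319.bam -bg > SRX113319.bg

for q in 05 10 20; do                         # three q-value cutoffs, run in parallel
  macs2 callpeak -t SRX113319.bam -f BAM -g dm -n SRX113319.$q -q 1e-$q &
done
wait

Launching the three callpeak jobs in the background is consistent with the interleaved progress counters that follow.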
INFO @ Mon, 03 Jun 2019 02:59:24: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 02:59:24: #1 read tag files...
INFO @ Mon, 03 Jun 2019 02:59:24: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 02:59:32: 1000000
INFO @ Mon, 03 Jun 2019 02:59:32: 1000000
INFO @ Mon, 03 Jun 2019 02:59:33: 1000000
INFO @ Mon, 03 Jun 2019 02:59:40: 2000000
INFO @ Mon, 03 Jun 2019 02:59:41: 2000000
INFO @ Mon, 03 Jun 2019 02:59:42: 2000000
INFO @ Mon, 03 Jun 2019 02:59:48: 3000000
INFO @ Mon, 03 Jun 2019 02:59:49: 3000000
INFO @ Mon, 03 Jun 2019 02:59:51: 3000000
INFO @ Mon, 03 Jun 2019 02:59:57: 4000000
INFO @ Mon, 03 Jun 2019 02:59:57: 4000000
INFO @ Mon, 03 Jun 2019 03:00:00: 4000000
INFO @ Mon, 03 Jun 2019 03:00:05: 5000000
INFO @ Mon, 03 Jun 2019 03:00:06: 5000000
INFO @ Mon, 03 Jun 2019 03:00:09: 5000000
INFO @ Mon, 03 Jun 2019 03:00:14: 6000000
INFO @ Mon, 03 Jun 2019 03:00:14: 6000000
INFO @ Mon, 03 Jun 2019 03:00:18: 6000000
INFO @ Mon, 03 Jun 2019 03:00:22: 7000000
INFO @ Mon, 03 Jun 2019 03:00:22: 7000000
INFO @ Mon, 03 Jun 2019 03:00:27: 7000000
INFO @ Mon, 03 Jun 2019 03:00:30: 8000000
INFO @ Mon, 03 Jun 2019 03:00:30: 8000000
INFO @ Mon, 03 Jun 2019 03:00:36: 8000000
INFO @ Mon, 03 Jun 2019 03:00:38: 9000000
INFO @ Mon, 03 Jun 2019 03:00:38: 9000000
INFO @ Mon, 03 Jun 2019 03:00:45: 9000000
INFO @ Mon, 03 Jun 2019 03:00:46: 10000000
INFO @ Mon, 03 Jun 2019 03:00:47: 10000000
INFO @ Mon, 03 Jun 2019 03:00:54: 10000000
INFO @ Mon, 03 Jun 2019 03:00:54: 11000000
INFO @ Mon, 03 Jun 2019 03:00:55: 11000000
INFO @ Mon, 03 Jun 2019 03:01:03: 12000000
INFO @ Mon, 03 Jun 2019 03:01:03: 12000000
INFO @ Mon, 03 Jun 2019 03:01:03: 11000000
INFO @ Mon, 03 Jun 2019 03:01:11: 13000000
INFO @ Mon, 03 Jun 2019 03:01:11: 13000000
INFO @ Mon, 03 Jun 2019 03:01:12: 12000000
INFO @ Mon, 03 Jun 2019 03:01:19: 14000000
INFO @ Mon, 03 Jun 2019 03:01:19: 14000000
INFO @ Mon, 03 Jun 2019 03:01:21: 13000000
INFO @ Mon, 03 Jun 2019 03:01:28: 15000000
INFO @ Mon, 03 Jun 2019 03:01:29: 15000000
INFO @ Mon, 03 Jun 2019 03:01:31: 14000000
INFO @ Mon, 03 Jun 2019 03:01:38: 16000000
INFO @ Mon, 03 Jun 2019 03:01:39: 16000000
INFO @ Mon, 03 Jun 2019 03:01:41: 15000000
INFO @ Mon, 03 Jun 2019 03:01:48: 17000000
INFO @ Mon, 03 Jun 2019 03:01:49: 17000000
INFO @ Mon, 03 Jun 2019 03:01:50: 16000000
INFO @ Mon, 03 Jun 2019 03:01:57: 18000000
INFO @ Mon, 03 Jun 2019 03:01:58: 18000000
INFO @ Mon, 03 Jun 2019 03:02:00: 17000000
INFO @ Mon, 03 Jun 2019 03:02:05: 19000000
INFO @ Mon, 03 Jun 2019 03:02:06: 19000000
INFO @ Mon, 03 Jun 2019 03:02:10: 18000000
INFO @ Mon, 03 Jun 2019 03:02:13: 20000000
INFO @ Mon, 03 Jun 2019 03:02:14: 20000000
INFO @ Mon, 03 Jun 2019 03:02:20: 19000000
INFO @ Mon, 03 Jun 2019 03:02:21: 21000000
INFO @ Mon, 03 Jun 2019 03:02:22: 21000000
INFO @ Mon, 03 Jun 2019 03:02:29: 22000000
INFO @ Mon, 03 Jun 2019 03:02:30: 20000000
INFO @ Mon, 03 Jun 2019 03:02:30: 22000000
INFO @ Mon, 03 Jun 2019 03:02:36: 23000000
INFO @ Mon, 03 Jun 2019 03:02:38: 23000000
INFO @ Mon, 03 Jun 2019 03:02:39: 21000000
INFO @ Mon, 03 Jun 2019 03:02:42: #1 tag size is determined as 44 bps
INFO @ Mon, 03 Jun 2019 03:02:42: #1 tag size = 44
INFO @ Mon, 03 Jun 2019 03:02:42: #1 total tags in treatment: 23697161
INFO @ Mon, 03 Jun 2019 03:02:42: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 03:02:42: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 03:02:43: #1 tags after filtering in treatment: 23697161
INFO @ Mon, 03 Jun 2019 03:02:43: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 03:02:43: #1 finished!
INFO @ Mon, 03 Jun 2019 03:02:43: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 03:02:43: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 03:02:44: #1 tag size is determined as 44 bps
INFO @ Mon, 03 Jun 2019 03:02:44: #1 tag size = 44
INFO @ Mon, 03 Jun 2019 03:02:44: #1 total tags in treatment: 23697161
INFO @ Mon, 03 Jun 2019 03:02:44: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 03:02:44: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 03:02:45: #1 tags after filtering in treatment: 23697161
INFO @ Mon, 03 Jun 2019 03:02:45: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 03:02:45: #1 finished!
INFO @ Mon, 03 Jun 2019 03:02:45: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 03:02:45: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 03:02:45: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 03:02:45: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 03:02:45: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
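The -q 1e-10 run above (like the other two below) stops at the model-building step because no paired plus/minus strand peaks are found, so no narrowPeak, model.r, or .xls files are written and the follow-up cut/rm steps fail. The MACS2 warning itself suggests a fallback; a minimal sketch of such a rerun, which this job did not execute, would be:

# Hypothetical rerun following the MACS2 suggestion above (not part of this job); paths copied from the log.
macs2 callpeak \
  -t /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.bam \
  -f BAM -g dm \
  -n /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.10 \
  -q 1e-10 \
  --nomodel --extsize 147    # skip the shifting model and assume a fixed 147 bp fragment size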
INFO @ Mon, 03 Jun 2019 03:02:47: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 03:02:47: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 03:02:47: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 03:02:49: 22000000
INFO @ Mon, 03 Jun 2019 03:02:58: 23000000
INFO @ Mon, 03 Jun 2019 03:03:04: #1 tag size is determined as 44 bps
INFO @ Mon, 03 Jun 2019 03:03:04: #1 tag size = 44
INFO @ Mon, 03 Jun 2019 03:03:04: #1 total tags in treatment: 23697161
INFO @ Mon, 03 Jun 2019 03:03:04: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 03:03:04: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 03:03:05: #1 tags after filtering in treatment: 23697161
INFO @ Mon, 03 Jun 2019 03:03:05: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 03:03:05: #1 finished!
INFO @ Mon, 03 Jun 2019 03:03:05: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 03:03:05: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 03:03:07: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 03:03:07: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 03:03:07: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX113319/SRX113319.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
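The final BedGraph-to-BigWig conversion reported above is typically done with the UCSC bedGraphToBigWig utility; the log does not show the exact command, so the following is a sketch with assumed file names and an assumed dm3 chromosome-sizes file.

# Assumed file names; the actual paths are not shown in the log.
fetchChromSizes dm3 > dm3.chrom.sizes                                 # UCSC helper script; any dm3 chrom.sizes file works
LC_COLLATE=C sort -k1,1 -k2,2n SRX113319.bg > SRX113319.sorted.bg     # bedGraphToBigWig requires coordinate-sorted input
bedGraphToBigWig SRX113319.sorted.bg dm3.chrom.sizes SRX113319.bw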