Job ID = 1293966
Downloading the .sra file...
Read layout: SINGLE
Converting to fastq...
2019-06-02T18:38:30 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-02T18:38:30 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
spots read      : 55,261,147
reads read      : 55,261,147
reads written   : 55,261,147
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:15:52
55261147 reads; of these:
  55261147 (100.00%) were unpaired; of these:
    1066055 (1.93%) aligned 0 times
    45427713 (82.21%) aligned exactly 1 time
    8767379 (15.87%) aligned >1 times
98.07% overall alignment rate
Time searching: 00:15:52
Overall time: 00:15:52
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 24 files...
[bam_rmdupse_core] 16999280 / 54195092 = 0.3137 in library ' '
Converted to BAM.
Creating the Bed file...
Converting to BedGraph...
INFO @ Mon, 03 Jun 2019 04:36:04:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 04:36:04: #1 read tag files...
INFO @ Mon, 03 Jun 2019 04:36:04: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 04:36:04:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 04:36:04: #1 read tag files...
INFO @ Mon, 03 Jun 2019 04:36:04: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 04:36:04:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 04:36:04: #1 read tag files...
INFO @ Mon, 03 Jun 2019 04:36:04: #1 read treatment tags...
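The three callpeak invocations above are identical except for the q-value cutoff (1e-05, 1e-10, 1e-20) and the matching output-name suffix. A sketch of how such command lines could be generated; the `build_callpeak` helper is hypothetical, not part of ChIP-Atlas:

```python
# Hypothetical helper: build the macs2 callpeak command line for one q-value
# cutoff, following the naming pattern visible in the log (<prefix>.<exponent>).
bam = "/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.bam"

def build_callpeak(bam_path: str, qval_exp: int) -> str:
    """Return a macs2 callpeak command for a 1e-<qval_exp> q-value cutoff."""
    prefix = bam_path[:-len(".bam")]            # strip the .bam extension
    name = f"{prefix}.{qval_exp:02d}"           # e.g. ...SRX152098.05
    return (f"macs2 callpeak -t {bam_path} -f BAM -g dm "
            f"-n {name} -q 1e-{qval_exp:02d}")

for q in (5, 10, 20):
    print(build_callpeak(bam, q))
```

All three runs read the same BAM, so MACS2 reports identical tag counts for each; only the significance threshold applied to the candidate peaks differs.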
INFO @ Mon, 03 Jun 2019 04:36:12: 1000000
INFO @ Mon, 03 Jun 2019 04:36:12: 1000000
INFO @ Mon, 03 Jun 2019 04:36:12: 1000000
INFO @ Mon, 03 Jun 2019 04:36:20: 2000000
INFO @ Mon, 03 Jun 2019 04:36:21: 2000000
INFO @ Mon, 03 Jun 2019 04:36:21: 2000000
INFO @ Mon, 03 Jun 2019 04:36:28: 3000000
INFO @ Mon, 03 Jun 2019 04:36:29: 3000000
INFO @ Mon, 03 Jun 2019 04:36:29: 3000000
INFO @ Mon, 03 Jun 2019 04:36:36: 4000000
INFO @ Mon, 03 Jun 2019 04:36:37: 4000000
INFO @ Mon, 03 Jun 2019 04:36:38: 4000000
INFO @ Mon, 03 Jun 2019 04:36:44: 5000000
INFO @ Mon, 03 Jun 2019 04:36:45: 5000000
INFO @ Mon, 03 Jun 2019 04:36:46: 5000000
INFO @ Mon, 03 Jun 2019 04:36:52: 6000000
INFO @ Mon, 03 Jun 2019 04:36:54: 6000000
INFO @ Mon, 03 Jun 2019 04:36:54: 6000000
INFO @ Mon, 03 Jun 2019 04:37:01: 7000000
INFO @ Mon, 03 Jun 2019 04:37:03: 7000000
INFO @ Mon, 03 Jun 2019 04:37:03: 7000000
INFO @ Mon, 03 Jun 2019 04:37:09: 8000000
INFO @ Mon, 03 Jun 2019 04:37:11: 8000000
INFO @ Mon, 03 Jun 2019 04:37:11: 8000000
INFO @ Mon, 03 Jun 2019 04:37:17: 9000000
INFO @ Mon, 03 Jun 2019 04:37:19: 9000000
INFO @ Mon, 03 Jun 2019 04:37:20: 9000000
INFO @ Mon, 03 Jun 2019 04:37:26: 10000000
INFO @ Mon, 03 Jun 2019 04:37:28: 10000000
INFO @ Mon, 03 Jun 2019 04:37:29: 10000000
INFO @ Mon, 03 Jun 2019 04:37:35: 11000000
INFO @ Mon, 03 Jun 2019 04:37:37: 11000000
INFO @ Mon, 03 Jun 2019 04:37:37: 11000000
INFO @ Mon, 03 Jun 2019 04:37:43: 12000000
INFO @ Mon, 03 Jun 2019 04:37:45: 12000000
INFO @ Mon, 03 Jun 2019 04:37:46: 12000000
INFO @ Mon, 03 Jun 2019 04:37:52: 13000000
INFO @ Mon, 03 Jun 2019 04:37:54: 13000000
INFO @ Mon, 03 Jun 2019 04:37:54: 13000000
INFO @ Mon, 03 Jun 2019 04:38:01: 14000000
INFO @ Mon, 03 Jun 2019 04:38:02: 14000000
INFO @ Mon, 03 Jun 2019 04:38:03: 14000000
INFO @ Mon, 03 Jun 2019 04:38:10: 15000000
INFO @ Mon, 03 Jun 2019 04:38:11: 15000000
INFO @ Mon, 03 Jun 2019 04:38:11: 15000000
INFO @ Mon, 03 Jun 2019 04:38:19: 16000000
INFO @ Mon, 03 Jun 2019 04:38:19: 16000000
INFO @ Mon, 03 Jun 2019 04:38:19: 16000000
INFO @ Mon, 03 Jun 2019 04:38:28: 17000000
INFO @ Mon, 03 Jun 2019 04:38:28: 17000000
INFO @ Mon, 03 Jun 2019 04:38:28: 17000000
INFO @ Mon, 03 Jun 2019 04:38:36: 18000000
INFO @ Mon, 03 Jun 2019 04:38:37: 18000000
INFO @ Mon, 03 Jun 2019 04:38:38: 18000000
INFO @ Mon, 03 Jun 2019 04:38:44: 19000000
INFO @ Mon, 03 Jun 2019 04:38:45: 19000000
INFO @ Mon, 03 Jun 2019 04:38:47: 19000000
INFO @ Mon, 03 Jun 2019 04:38:52: 20000000
INFO @ Mon, 03 Jun 2019 04:38:53: 20000000
INFO @ Mon, 03 Jun 2019 04:38:56: 20000000
INFO @ Mon, 03 Jun 2019 04:39:00: 21000000
INFO @ Mon, 03 Jun 2019 04:39:01: 21000000
INFO @ Mon, 03 Jun 2019 04:39:05: 21000000
INFO @ Mon, 03 Jun 2019 04:39:08: 22000000
INFO @ Mon, 03 Jun 2019 04:39:09: 22000000
INFO @ Mon, 03 Jun 2019 04:39:14: 22000000
INFO @ Mon, 03 Jun 2019 04:39:16: 23000000
INFO @ Mon, 03 Jun 2019 04:39:17: 23000000
INFO @ Mon, 03 Jun 2019 04:39:23: 23000000
INFO @ Mon, 03 Jun 2019 04:39:24: 24000000
INFO @ Mon, 03 Jun 2019 04:39:25: 24000000
INFO @ Mon, 03 Jun 2019 04:39:31: 25000000
INFO @ Mon, 03 Jun 2019 04:39:32: 24000000
INFO @ Mon, 03 Jun 2019 04:39:33: 25000000
INFO @ Mon, 03 Jun 2019 04:39:39: 26000000
INFO @ Mon, 03 Jun 2019 04:39:41: 25000000
INFO @ Mon, 03 Jun 2019 04:39:41: 26000000
INFO @ Mon, 03 Jun 2019 04:39:47: 27000000
INFO @ Mon, 03 Jun 2019 04:39:49: 27000000
INFO @ Mon, 03 Jun 2019 04:39:50: 26000000
INFO @ Mon, 03 Jun 2019 04:39:55: 28000000
INFO @ Mon, 03 Jun 2019 04:39:57: 28000000
INFO @ Mon, 03 Jun 2019 04:39:58: 27000000
INFO @ Mon, 03 Jun 2019 04:40:03: 29000000
INFO @ Mon, 03 Jun 2019 04:40:05: 29000000
INFO @ Mon, 03 Jun 2019 04:40:07: 28000000
INFO @ Mon, 03 Jun 2019 04:40:11: 30000000
INFO @ Mon, 03 Jun 2019 04:40:13: 30000000
INFO @ Mon, 03 Jun 2019 04:40:16: 29000000
INFO @ Mon, 03 Jun 2019 04:40:19: 31000000
INFO @ Mon, 03 Jun 2019 04:40:21: 31000000
INFO @ Mon, 03 Jun 2019 04:40:25: 30000000
INFO @ Mon, 03 Jun 2019 04:40:27: 32000000
INFO @ Mon, 03 Jun 2019 04:40:29: 32000000
INFO @ Mon, 03 Jun 2019 04:40:33: 31000000
INFO @ Mon, 03 Jun 2019 04:40:34: 33000000
INFO @ Mon, 03 Jun 2019 04:40:37: 33000000
INFO @ Mon, 03 Jun 2019 04:40:42: 34000000
INFO @ Mon, 03 Jun 2019 04:40:42: 32000000
INFO @ Mon, 03 Jun 2019 04:40:45: 34000000
INFO @ Mon, 03 Jun 2019 04:40:50: 35000000
INFO @ Mon, 03 Jun 2019 04:40:51: 33000000
INFO @ Mon, 03 Jun 2019 04:40:53: 35000000
INFO @ Mon, 03 Jun 2019 04:40:58: 36000000
INFO @ Mon, 03 Jun 2019 04:41:00: 34000000
INFO @ Mon, 03 Jun 2019 04:41:01: 36000000
INFO @ Mon, 03 Jun 2019 04:41:06: 37000000
INFO @ Mon, 03 Jun 2019 04:41:07: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 04:41:07: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 04:41:07: #1 total tags in treatment: 37195812
INFO @ Mon, 03 Jun 2019 04:41:07: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 04:41:07: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 04:41:08: #1 tags after filtering in treatment: 37195812
INFO @ Mon, 03 Jun 2019 04:41:08: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 04:41:08: #1 finished!
INFO @ Mon, 03 Jun 2019 04:41:08: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 04:41:08: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 04:41:08: 35000000
INFO @ Mon, 03 Jun 2019 04:41:08: 37000000
INFO @ Mon, 03 Jun 2019 04:41:10: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 04:41:10: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 04:41:10: #1 total tags in treatment: 37195812
INFO @ Mon, 03 Jun 2019 04:41:10: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 04:41:10: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 04:41:11: #1 tags after filtering in treatment: 37195812
INFO @ Mon, 03 Jun 2019 04:41:11: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 04:41:11: #1 finished!
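The tag counts MACS2 reports here are consistent with the earlier bowtie and samtools output. A quick sanity sketch (not pipeline code; all numbers copied from this log):

```python
# Cross-check of the counts reported in this log.
total_reads = 55_261_147   # bowtie: total (unpaired) input reads
unaligned   = 1_066_055    # aligned 0 times
unique      = 45_427_713   # aligned exactly 1 time
multi       = 8_767_379    # aligned >1 times
aligned = unique + multi

# bowtie's category counts sum to the input, and the rate matches 98.07%
assert unaligned + aligned == total_reads
print(f"{100 * aligned / total_reads:.2f}% overall alignment rate")  # 98.07%

# samtools rmdup: 16999280 duplicates among 54195092 aligned reads
duplicates = 16_999_280
assert aligned == 54_195_092
print(f"duplicate rate: {duplicates / aligned:.4f}")  # 0.3137

# MACS2 then sees exactly the deduplicated reads as treatment tags,
# which is why its own redundant rate is 0.00
assert aligned - duplicates == 37_195_812
```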
INFO @ Mon, 03 Jun 2019 04:41:11: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 04:41:11: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 04:41:11: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 04:41:11: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 04:41:11: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 04:41:14: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 04:41:14: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 04:41:14: Process for pairing-model is terminated!
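The cut/rm "No such file or directory" errors above occur because the post-processing steps assume MACS2 wrote a narrowPeak file, which it never did after the model-building failure (0 paired peaks). A defensive sketch; the guard is an assumption for illustration, not actual ChIP-Atlas code, though the `<prefix>_peaks.narrowPeak` naming follows the log:

```python
from pathlib import Path

def postprocess(prefix: str) -> bool:
    """Run peak post-processing only if MACS2 actually wrote peaks."""
    peaks = Path(f"{prefix}_peaks.narrowPeak")
    if not peaks.exists():
        # MACS2 terminated before writing output; skip cut, bigBed
        # conversion, and the cleanup rm's to avoid the error cascade.
        return False
    # ... cut columns, convert, then remove intermediates ...
    return True

# e.g. postprocess("/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.20")
```

MACS2's own warning suggests the other way out: rerun with `--nomodel --extsize 147` (or another fixed fragment size) so peak calling can proceed without the paired-peak model.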
cut: /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 04:41:17: 36000000
INFO @ Mon, 03 Jun 2019 04:41:25: 37000000
INFO @ Mon, 03 Jun 2019 04:41:27: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 04:41:27: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 04:41:27: #1 total tags in treatment: 37195812
INFO @ Mon, 03 Jun 2019 04:41:27: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 04:41:27: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 04:41:28: #1 tags after filtering in treatment: 37195812
INFO @ Mon, 03 Jun 2019 04:41:28: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 04:41:28: #1 finished!
INFO @ Mon, 03 Jun 2019 04:41:28: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 04:41:28: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 04:41:31: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 04:41:31: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 04:41:31: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX152098/SRX152098.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
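The log does not show the exact command behind the final BedGraph-to-BigWig step; a common choice for this conversion is UCSC's bedGraphToBigWig, sketched here. The tool invocation, file names, and chrom.sizes input are assumptions, not taken from the log:

```python
import subprocess

def bedgraph_to_bigwig(bedgraph: str, chrom_sizes: str, bigwig: str,
                       run: bool = False) -> list:
    """Build (and optionally execute) a bedGraphToBigWig command.

    bedGraphToBigWig takes the input bedGraph, a chromosome-sizes file
    for the genome assembly (dm3 here), and the output .bw path.
    """
    cmd = ["bedGraphToBigWig", bedgraph, chrom_sizes, bigwig]
    if run:  # requires the UCSC tool on PATH
        subprocess.run(cmd, check=True)
    return cmd

# Hypothetical file names following the SRX152098 prefix used above:
print(" ".join(bedgraph_to_bigwig("SRX152098.bg", "dm3.chrom.sizes", "SRX152098.bw")))
```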