Job ID = 1293623
Downloading the sra file...
Read layout: SINGLE
Converting to fastq...
2019-06-02T16:08:18 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
spots read : 44,852,378
reads read : 44,852,378
reads written : 44,852,378
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:20:27
44852378 reads; of these:
  44852378 (100.00%) were unpaired; of these:
    1182257 (2.64%) aligned 0 times
    30691574 (68.43%) aligned exactly 1 time
    12978547 (28.94%) aligned >1 times
97.36% overall alignment rate
Time searching: 00:20:27
Overall time: 00:20:27
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 20 files...
[bam_rmdupse_core] 7763886 / 43670121 = 0.1778 in library ' '
Converted to BAM.
Creating the Bed file...
Converting to BedGraph...
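The bowtie summary and the rmdup line above can be cross-checked from the raw counts; a minimal sketch, with the numbers copied verbatim from the log:

```python
# Counts taken verbatim from the bowtie summary above.
total = 44_852_378        # reads processed
unaligned = 1_182_257     # aligned 0 times
unique = 30_691_574       # aligned exactly 1 time
multi = 12_978_547        # aligned >1 times

assert unaligned + unique + multi == total
overall = (unique + multi) / total
print(f"overall alignment rate: {overall:.2%}")   # matches the logged 97.36%

# [bam_rmdupse_core] reports duplicates over 43,670,121 reads, which is
# exactly total - unaligned from the bowtie summary: rmdup ran on the
# aligned reads only.
aligned = total - unaligned
dup_rate = 7_763_886 / aligned
print(f"duplicate rate: {dup_rate:.4f}")          # matches the logged 0.1778
```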
INFO @ Mon, 03 Jun 2019 01:51:21:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 01:51:21: #1 read tag files...
INFO @ Mon, 03 Jun 2019 01:51:21: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 01:51:21:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 01:51:21:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 01:51:21: #1 read tag files...
INFO @ Mon, 03 Jun 2019 01:51:21: #1 read tag files...
INFO @ Mon, 03 Jun 2019 01:51:21: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 01:51:21: #1 read treatment tags...
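The three callpeak headers above differ only in output name and q-value cutoff. A hypothetical sketch of how such a set of invocations could be generated; the actual driver script is not part of this log:

```python
# Build the three macs2 callpeak command lines seen in the log, one per
# q-value cutoff. Paths are copied from the log; the loop itself is an
# illustrative reconstruction, not the real pipeline code.
bam = "/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.bam"
prefix = bam[: -len(".bam")]
cmds = [
    f"macs2 callpeak -t {bam} -f BAM -g dm -n {prefix}.{tag} -q 1e-{tag}"
    for tag in ("05", "10", "20")
]
for c in cmds:
    print(c)
```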
INFO @ Mon, 03 Jun 2019 01:51:30: 1000000
INFO @ Mon, 03 Jun 2019 01:51:30: 1000000
INFO @ Mon, 03 Jun 2019 01:51:30: 1000000
INFO @ Mon, 03 Jun 2019 01:51:39: 2000000
INFO @ Mon, 03 Jun 2019 01:51:39: 2000000
INFO @ Mon, 03 Jun 2019 01:51:39: 2000000
INFO @ Mon, 03 Jun 2019 01:51:47: 3000000
INFO @ Mon, 03 Jun 2019 01:51:48: 3000000
INFO @ Mon, 03 Jun 2019 01:51:48: 3000000
INFO @ Mon, 03 Jun 2019 01:51:56: 4000000
INFO @ Mon, 03 Jun 2019 01:51:57: 4000000
INFO @ Mon, 03 Jun 2019 01:51:57: 4000000
INFO @ Mon, 03 Jun 2019 01:52:04: 5000000
INFO @ Mon, 03 Jun 2019 01:52:05: 5000000
INFO @ Mon, 03 Jun 2019 01:52:05: 5000000
INFO @ Mon, 03 Jun 2019 01:52:12: 6000000
INFO @ Mon, 03 Jun 2019 01:52:14: 6000000
INFO @ Mon, 03 Jun 2019 01:52:14: 6000000
INFO @ Mon, 03 Jun 2019 01:52:21: 7000000
INFO @ Mon, 03 Jun 2019 01:52:22: 7000000
INFO @ Mon, 03 Jun 2019 01:52:22: 7000000
INFO @ Mon, 03 Jun 2019 01:52:29: 8000000
INFO @ Mon, 03 Jun 2019 01:52:31: 8000000
INFO @ Mon, 03 Jun 2019 01:52:31: 8000000
INFO @ Mon, 03 Jun 2019 01:52:37: 9000000
INFO @ Mon, 03 Jun 2019 01:52:39: 9000000
INFO @ Mon, 03 Jun 2019 01:52:39: 9000000
INFO @ Mon, 03 Jun 2019 01:52:46: 10000000
INFO @ Mon, 03 Jun 2019 01:52:48: 10000000
INFO @ Mon, 03 Jun 2019 01:52:48: 10000000
INFO @ Mon, 03 Jun 2019 01:52:54: 11000000
INFO @ Mon, 03 Jun 2019 01:52:56: 11000000
INFO @ Mon, 03 Jun 2019 01:52:56: 11000000
INFO @ Mon, 03 Jun 2019 01:53:02: 12000000
INFO @ Mon, 03 Jun 2019 01:53:05: 12000000
INFO @ Mon, 03 Jun 2019 01:53:05: 12000000
INFO @ Mon, 03 Jun 2019 01:53:11: 13000000
INFO @ Mon, 03 Jun 2019 01:53:14: 13000000
INFO @ Mon, 03 Jun 2019 01:53:14: 13000000
INFO @ Mon, 03 Jun 2019 01:53:19: 14000000
INFO @ Mon, 03 Jun 2019 01:53:22: 14000000
INFO @ Mon, 03 Jun 2019 01:53:22: 14000000
INFO @ Mon, 03 Jun 2019 01:53:28: 15000000
INFO @ Mon, 03 Jun 2019 01:53:31: 15000000
INFO @ Mon, 03 Jun 2019 01:53:31: 15000000
INFO @ Mon, 03 Jun 2019 01:53:36: 16000000
INFO @ Mon, 03 Jun 2019 01:53:39: 16000000
INFO @ Mon, 03 Jun 2019 01:53:39: 16000000
INFO @ Mon, 03 Jun 2019 01:53:45: 17000000
INFO @ Mon, 03 Jun 2019 01:53:46: 17000000
INFO @ Mon, 03 Jun 2019 01:53:46: 17000000
INFO @ Mon, 03 Jun 2019 01:53:53: 18000000
INFO @ Mon, 03 Jun 2019 01:53:53: 18000000
INFO @ Mon, 03 Jun 2019 01:53:54: 18000000
INFO @ Mon, 03 Jun 2019 01:54:00: 19000000
INFO @ Mon, 03 Jun 2019 01:54:01: 19000000
INFO @ Mon, 03 Jun 2019 01:54:02: 19000000
INFO @ Mon, 03 Jun 2019 01:54:08: 20000000
INFO @ Mon, 03 Jun 2019 01:54:09: 20000000
INFO @ Mon, 03 Jun 2019 01:54:10: 20000000
INFO @ Mon, 03 Jun 2019 01:54:15: 21000000
INFO @ Mon, 03 Jun 2019 01:54:17: 21000000
INFO @ Mon, 03 Jun 2019 01:54:17: 21000000
INFO @ Mon, 03 Jun 2019 01:54:22: 22000000
INFO @ Mon, 03 Jun 2019 01:54:25: 22000000
INFO @ Mon, 03 Jun 2019 01:54:25: 22000000
INFO @ Mon, 03 Jun 2019 01:54:29: 23000000
INFO @ Mon, 03 Jun 2019 01:54:33: 23000000
INFO @ Mon, 03 Jun 2019 01:54:33: 23000000
INFO @ Mon, 03 Jun 2019 01:54:36: 24000000
INFO @ Mon, 03 Jun 2019 01:54:41: 24000000
INFO @ Mon, 03 Jun 2019 01:54:41: 24000000
INFO @ Mon, 03 Jun 2019 01:54:43: 25000000
INFO @ Mon, 03 Jun 2019 01:54:49: 25000000
INFO @ Mon, 03 Jun 2019 01:54:49: 25000000
INFO @ Mon, 03 Jun 2019 01:54:50: 26000000
INFO @ Mon, 03 Jun 2019 01:54:57: 26000000
INFO @ Mon, 03 Jun 2019 01:54:57: 26000000
INFO @ Mon, 03 Jun 2019 01:54:57: 27000000
INFO @ Mon, 03 Jun 2019 01:55:04: 27000000
INFO @ Mon, 03 Jun 2019 01:55:05: 28000000
INFO @ Mon, 03 Jun 2019 01:55:05: 27000000
INFO @ Mon, 03 Jun 2019 01:55:12: 29000000
INFO @ Mon, 03 Jun 2019 01:55:12: 28000000
INFO @ Mon, 03 Jun 2019 01:55:13: 28000000
INFO @ Mon, 03 Jun 2019 01:55:19: 30000000
INFO @ Mon, 03 Jun 2019 01:55:20: 29000000
INFO @ Mon, 03 Jun 2019 01:55:21: 29000000
INFO @ Mon, 03 Jun 2019 01:55:26: 31000000
INFO @ Mon, 03 Jun 2019 01:55:28: 30000000
INFO @ Mon, 03 Jun 2019 01:55:29: 30000000
INFO @ Mon, 03 Jun 2019 01:55:33: 32000000
INFO @ Mon, 03 Jun 2019 01:55:35: 31000000
INFO @ Mon, 03 Jun 2019 01:55:37: 31000000
INFO @ Mon, 03 Jun 2019 01:55:40: 33000000
INFO @ Mon, 03 Jun 2019 01:55:43: 32000000
INFO @ Mon, 03 Jun 2019 01:55:45: 32000000
INFO @ Mon, 03 Jun 2019 01:55:47: 34000000
INFO @ Mon, 03 Jun 2019 01:55:51: 33000000
INFO @ Mon, 03 Jun 2019 01:55:53: 33000000
INFO @ Mon, 03 Jun 2019 01:55:54: 35000000
INFO @ Mon, 03 Jun 2019 01:55:59: 34000000
INFO @ Mon, 03 Jun 2019 01:56:01: 34000000
INFO @ Mon, 03 Jun 2019 01:56:02: #1 tag size is determined as 51 bps
INFO @ Mon, 03 Jun 2019 01:56:02: #1 tag size = 51
INFO @ Mon, 03 Jun 2019 01:56:02: #1 total tags in treatment: 35906235
INFO @ Mon, 03 Jun 2019 01:56:02: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 01:56:02: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 01:56:02: #1 tags after filtering in treatment: 35906235
INFO @ Mon, 03 Jun 2019 01:56:02: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 01:56:02: #1 finished!
INFO @ Mon, 03 Jun 2019 01:56:02: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 01:56:02: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 01:56:05: #2 number of paired peaks: 9
WARNING @ Mon, 03 Jun 2019 01:56:05: Too few paired peaks (9) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 01:56:05: Process for pairing-model is terminated!
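MACS2 could not build its fragment-size model here (only 9 paired peaks) and suggests rerunning with a fixed extension size. A hypothetical fallback command following that suggestion; the log shows no such rerun:

```python
# Append the fallback flags MACS2 recommends (--nomodel with a fixed
# --extsize) to the original callpeak invocation from the log. This is an
# illustrative reconstruction; the pipeline did not actually rerun.
base = (
    "macs2 callpeak"
    " -t /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.bam"
    " -f BAM -g dm"
    " -n /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.05"
    " -q 1e-05"
)
fallback = base + " --nomodel --extsize 147"
print(fallback)
```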
cut: /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 01:56:06: 35000000
INFO @ Mon, 03 Jun 2019 01:56:09: 35000000
INFO @ Mon, 03 Jun 2019 01:56:13: #1 tag size is determined as 51 bps
INFO @ Mon, 03 Jun 2019 01:56:13: #1 tag size = 51
INFO @ Mon, 03 Jun 2019 01:56:13: #1 total tags in treatment: 35906235
INFO @ Mon, 03 Jun 2019 01:56:13: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 01:56:13: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 01:56:14: #1 tags after filtering in treatment: 35906235
INFO @ Mon, 03 Jun 2019 01:56:14: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 01:56:14: #1 finished!
INFO @ Mon, 03 Jun 2019 01:56:14: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 01:56:14: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 01:56:17: #2 number of paired peaks: 9
WARNING @ Mon, 03 Jun 2019 01:56:17: Too few paired peaks (9) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 01:56:17: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 01:56:17: #1 tag size is determined as 51 bps
INFO @ Mon, 03 Jun 2019 01:56:17: #1 tag size = 51
INFO @ Mon, 03 Jun 2019 01:56:17: #1 total tags in treatment: 35906235
INFO @ Mon, 03 Jun 2019 01:56:17: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 01:56:17: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 01:56:18: #1 tags after filtering in treatment: 35906235
INFO @ Mon, 03 Jun 2019 01:56:18: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 01:56:18: #1 finished!
INFO @ Mon, 03 Jun 2019 01:56:18: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 01:56:18: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 01:56:21: #2 number of paired peaks: 9
WARNING @ Mon, 03 Jun 2019 01:56:21: Too few paired peaks (9) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 01:56:21: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX1032397/SRX1032397.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
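The final bedGraph-to-bigWig step is typically performed with UCSC's bedGraphToBigWig, which needs a chromosome-sizes file for the assembly (dm3 here). A sketch with hypothetical file names, since the log does not show the actual command:

```python
# Assemble a bedGraphToBigWig invocation: input bedGraph, a two-column
# chrom.sizes file for dm3, and the output bigWig. All three file names
# are placeholders for illustration.
import shlex

bedgraph = "SRX1032397.bg"
chrom_sizes = "dm3.chrom.sizes"
bigwig = "SRX1032397.bw"
cmd = ["bedGraphToBigWig", bedgraph, chrom_sizes, bigwig]
print(shlex.join(cmd))
```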