Job ID = 1293882
Downloading the SRA file...
Read layout: SINGLE
Converting to fastq...
2019-06-02T18:02:09 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-02T18:02:09 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
spots read      : 96,713,808
reads read      : 96,713,808
reads written   : 96,713,808
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:25:36
96713808 reads; of these:
  96713808 (100.00%) were unpaired; of these:
    7219274 (7.46%) aligned 0 times
    79862201 (82.58%) aligned exactly 1 time
    9632333 (9.96%) aligned >1 times
92.54% overall alignment rate
Time searching: 00:25:36
Overall time: 00:25:36
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 40 files...
[bam_rmdupse_core] 45992551 / 89494534 = 0.5139 in library ' '
Converted to BAM.
Creating the Bed file...
Converting to BedGraph...
INFO @ Mon, 03 Jun 2019 03:52:47:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 03:52:47: #1 read tag files...
INFO @ Mon, 03 Jun 2019 03:52:47: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 03:52:47:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 03:52:47: #1 read tag files...
INFO @ Mon, 03 Jun 2019 03:52:47: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 03:52:47:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 03:52:47: #1 read tag files...
INFO @ Mon, 03 Jun 2019 03:52:47: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 03:52:56: 1000000
INFO @ Mon, 03 Jun 2019 03:52:56: 1000000
INFO @ Mon, 03 Jun 2019 03:52:56: 1000000
INFO @ Mon, 03 Jun 2019 03:53:05: 2000000
INFO @ Mon, 03 Jun 2019 03:53:05: 2000000
INFO @ Mon, 03 Jun 2019 03:53:06: 2000000
INFO @ Mon, 03 Jun 2019 03:53:14: 3000000
INFO @ Mon, 03 Jun 2019 03:53:14: 3000000
INFO @ Mon, 03 Jun 2019 03:53:14: 3000000
INFO @ Mon, 03 Jun 2019 03:53:23: 4000000
INFO @ Mon, 03 Jun 2019 03:53:23: 4000000
INFO @ Mon, 03 Jun 2019 03:53:25: 4000000
INFO @ Mon, 03 Jun 2019 03:53:32: 5000000
INFO @ Mon, 03 Jun 2019 03:53:32: 5000000
INFO @ Mon, 03 Jun 2019 03:53:34: 5000000
INFO @ Mon, 03 Jun 2019 03:53:40: 6000000
INFO @ Mon, 03 Jun 2019 03:53:40: 6000000
INFO @ Mon, 03 Jun 2019 03:53:42: 6000000
INFO @ Mon, 03 Jun 2019 03:53:49: 7000000
INFO @ Mon, 03 Jun 2019 03:53:49: 7000000
INFO @ Mon, 03 Jun 2019 03:53:51: 7000000
INFO @ Mon, 03 Jun 2019 03:53:58: 8000000
INFO @ Mon, 03 Jun 2019 03:53:58: 8000000
INFO @ Mon, 03 Jun 2019 03:53:59: 8000000
INFO @ Mon, 03 Jun 2019 03:54:07: 9000000
INFO @ Mon, 03 Jun 2019 03:54:07: 9000000
INFO @ Mon, 03 Jun 2019 03:54:08: 9000000
INFO @ Mon, 03 Jun 2019 03:54:16: 10000000
INFO @ Mon, 03 Jun 2019 03:54:16: 10000000
INFO @ Mon, 03 Jun 2019 03:54:17: 10000000
INFO @ Mon, 03 Jun 2019 03:54:24: 11000000
INFO @ Mon, 03 Jun 2019 03:54:24: 11000000
INFO @ Mon, 03 Jun 2019 03:54:25: 11000000
INFO @ Mon, 03 Jun 2019 03:54:33: 12000000
INFO @ Mon, 03 Jun 2019 03:54:33: 12000000
INFO @ Mon, 03 Jun 2019 03:54:33: 12000000
INFO @ Mon, 03 Jun 2019 03:54:42: 13000000
INFO @ Mon, 03 Jun 2019 03:54:42: 13000000
INFO @ Mon, 03 Jun 2019 03:54:42: 13000000
INFO @ Mon, 03 Jun 2019 03:54:50: 14000000
INFO @ Mon, 03 Jun 2019 03:54:50: 14000000
INFO @ Mon, 03 Jun 2019 03:54:50: 14000000
INFO @ Mon, 03 Jun 2019 03:54:58: 15000000
INFO @ Mon, 03 Jun 2019 03:54:58: 15000000
INFO @ Mon, 03 Jun 2019 03:54:59: 15000000
INFO @ Mon, 03 Jun 2019 03:55:07: 16000000
INFO @ Mon, 03 Jun 2019 03:55:07: 16000000
INFO @ Mon, 03 Jun 2019 03:55:07: 16000000
INFO @ Mon, 03 Jun 2019 03:55:15: 17000000
INFO @ Mon, 03 Jun 2019 03:55:15: 17000000
INFO @ Mon, 03 Jun 2019 03:55:15: 17000000
INFO @ Mon, 03 Jun 2019 03:55:23: 18000000
INFO @ Mon, 03 Jun 2019 03:55:24: 18000000
INFO @ Mon, 03 Jun 2019 03:55:24: 18000000
INFO @ Mon, 03 Jun 2019 03:55:32: 19000000
INFO @ Mon, 03 Jun 2019 03:55:32: 19000000
INFO @ Mon, 03 Jun 2019 03:55:32: 19000000
INFO @ Mon, 03 Jun 2019 03:55:40: 20000000
INFO @ Mon, 03 Jun 2019 03:55:40: 20000000
INFO @ Mon, 03 Jun 2019 03:55:41: 20000000
INFO @ Mon, 03 Jun 2019 03:55:50: 21000000
INFO @ Mon, 03 Jun 2019 03:55:50: 21000000
INFO @ Mon, 03 Jun 2019 03:55:50: 21000000
INFO @ Mon, 03 Jun 2019 03:55:59: 22000000
INFO @ Mon, 03 Jun 2019 03:56:00: 22000000
INFO @ Mon, 03 Jun 2019 03:56:00: 22000000
INFO @ Mon, 03 Jun 2019 03:56:08: 23000000
INFO @ Mon, 03 Jun 2019 03:56:09: 23000000
INFO @ Mon, 03 Jun 2019 03:56:09: 23000000
INFO @ Mon, 03 Jun 2019 03:56:17: 24000000
INFO @ Mon, 03 Jun 2019 03:56:18: 24000000
INFO @ Mon, 03 Jun 2019 03:56:18: 24000000
INFO @ Mon, 03 Jun 2019 03:56:26: 25000000
INFO @ Mon, 03 Jun 2019 03:56:27: 25000000
INFO @ Mon, 03 Jun 2019 03:56:28: 25000000
INFO @ Mon, 03 Jun 2019 03:56:35: 26000000
INFO @ Mon, 03 Jun 2019 03:56:37: 26000000
INFO @ Mon, 03 Jun 2019 03:56:37: 26000000
INFO @ Mon, 03 Jun 2019 03:56:44: 27000000
INFO @ Mon, 03 Jun 2019 03:56:46: 27000000
INFO @ Mon, 03 Jun 2019 03:56:46: 27000000
INFO @ Mon, 03 Jun 2019 03:56:53: 28000000
INFO @ Mon, 03 Jun 2019 03:56:55: 28000000
INFO @ Mon, 03 Jun 2019 03:56:55: 28000000
INFO @ Mon, 03 Jun 2019 03:57:02: 29000000
INFO @ Mon, 03 Jun 2019 03:57:04: 29000000
INFO @ Mon, 03 Jun 2019 03:57:04: 29000000
INFO @ Mon, 03 Jun 2019 03:57:11: 30000000
INFO @ Mon, 03 Jun 2019 03:57:13: 30000000
INFO @ Mon, 03 Jun 2019 03:57:13: 30000000
INFO @ Mon, 03 Jun 2019 03:57:20: 31000000
INFO @ Mon, 03 Jun 2019 03:57:22: 31000000
INFO @ Mon, 03 Jun 2019 03:57:22: 31000000
INFO @ Mon, 03 Jun 2019 03:57:29: 32000000
INFO @ Mon, 03 Jun 2019 03:57:31: 32000000
INFO @ Mon, 03 Jun 2019 03:57:31: 32000000
INFO @ Mon, 03 Jun 2019 03:57:37: 33000000
INFO @ Mon, 03 Jun 2019 03:57:40: 33000000
INFO @ Mon, 03 Jun 2019 03:57:40: 33000000
INFO @ Mon, 03 Jun 2019 03:57:46: 34000000
INFO @ Mon, 03 Jun 2019 03:57:49: 34000000
INFO @ Mon, 03 Jun 2019 03:57:49: 34000000
INFO @ Mon, 03 Jun 2019 03:57:56: 35000000
INFO @ Mon, 03 Jun 2019 03:57:57: 35000000
INFO @ Mon, 03 Jun 2019 03:57:58: 35000000
INFO @ Mon, 03 Jun 2019 03:58:05: 36000000
INFO @ Mon, 03 Jun 2019 03:58:07: 36000000
INFO @ Mon, 03 Jun 2019 03:58:07: 36000000
INFO @ Mon, 03 Jun 2019 03:58:14: 37000000
INFO @ Mon, 03 Jun 2019 03:58:16: 37000000
INFO @ Mon, 03 Jun 2019 03:58:16: 37000000
INFO @ Mon, 03 Jun 2019 03:58:24: 38000000
INFO @ Mon, 03 Jun 2019 03:58:25: 38000000
INFO @ Mon, 03 Jun 2019 03:58:26: 38000000
INFO @ Mon, 03 Jun 2019 03:58:33: 39000000
INFO @ Mon, 03 Jun 2019 03:58:33: 39000000
INFO @ Mon, 03 Jun 2019 03:58:35: 39000000
INFO @ Mon, 03 Jun 2019 03:58:42: 40000000
INFO @ Mon, 03 Jun 2019 03:58:42: 40000000
INFO @ Mon, 03 Jun 2019 03:58:44: 40000000
INFO @ Mon, 03 Jun 2019 03:58:50: 41000000
INFO @ Mon, 03 Jun 2019 03:58:51: 41000000
INFO @ Mon, 03 Jun 2019 03:58:52: 41000000
INFO @ Mon, 03 Jun 2019 03:58:59: 42000000
INFO @ Mon, 03 Jun 2019 03:59:00: 42000000
INFO @ Mon, 03 Jun 2019 03:59:01: 42000000
INFO @ Mon, 03 Jun 2019 03:59:08: 43000000
INFO @ Mon, 03 Jun 2019 03:59:08: 43000000
INFO @ Mon, 03 Jun 2019 03:59:09: 43000000
INFO @ Mon, 03 Jun 2019 03:59:13: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 03:59:13: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 03:59:13: #1 total tags in treatment: 43501983
INFO @ Mon, 03 Jun 2019 03:59:13: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 03:59:13: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 03:59:14: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 03:59:14: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 03:59:14: #1 total tags in treatment: 43501983
INFO @ Mon, 03 Jun 2019 03:59:14: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 03:59:14: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 03:59:14: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 03:59:14: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 03:59:14: #1 total tags in treatment: 43501983
INFO @ Mon, 03 Jun 2019 03:59:14: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 03:59:14: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 03:59:14: #1 tags after filtering in treatment: 43501983
INFO @ Mon, 03 Jun 2019 03:59:14: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 03:59:14: #1 finished!
INFO @ Mon, 03 Jun 2019 03:59:14: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 03:59:14: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 03:59:14: #1 tags after filtering in treatment: 43501983
INFO @ Mon, 03 Jun 2019 03:59:14: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 03:59:14: #1 finished!
INFO @ Mon, 03 Jun 2019 03:59:14: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 03:59:14: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 03:59:15: #1 tags after filtering in treatment: 43501983
INFO @ Mon, 03 Jun 2019 03:59:15: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 03:59:15: #1 finished!
INFO @ Mon, 03 Jun 2019 03:59:15: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 03:59:15: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 03:59:18: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 03:59:18: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 03:59:18: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 03:59:18: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 03:59:18: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 03:59:18: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 03:59:18: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 03:59:18: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 03:59:18: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
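Note on the peak-calling failure: all three callpeak runs stop at the model-building step ("number of paired peaks: 0"), so no narrowPeak files are written and the subsequent cut/rm clean-up commands report "No such file or directory". Below is a minimal sketch of the rerun that the MACS2 warning itself suggests, shown for the q 1e-05 cutoff; the 147 bp value is taken from the warning text, and whether it is appropriate for this library is an assumption.

# Skip the paired-peak model and use a fixed fragment size, as suggested by the MACS2 warning above.
macs2 callpeak \
  -t /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.bam \
  -f BAM -g dm \
  -n /home/okishinya/chipatlas/results/dm3/SRX135525/SRX135525.05 \
  -q 1e-05 \
  --nomodel --extsize 147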
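For reference, a minimal shell sketch of commands that could reproduce the stages recorded in this log. Only the macs2 callpeak arguments appear verbatim above; the run-accession placeholder, the bowtie2 index and chrom.sizes paths, thread counts, and the exact samtools, bedtools, and UCSC invocations are assumptions, not the actual ChIP-Atlas pipeline code.

SRR=<run accession>                                   # not shown in this log; only the experiment SRX135525 is named
fasterq-dump "$SRR" -O .                              # download and convert to fastq
bowtie2 -p 4 -x dm3 -U "$SRR".fastq -S SRX135525.sam  # the alignment summary above matches bowtie2's format; index name assumed
samtools view -Sb SRX135525.sam | samtools sort -o SRX135525.sort.bam -
samtools rmdup -s SRX135525.sort.bam SRX135525.bam    # corresponds to the [bam_rmdupse_core] line above
macs2 callpeak -t SRX135525.bam -f BAM -g dm -n SRX135525.05 -q 1e-05
macs2 callpeak -t SRX135525.bam -f BAM -g dm -n SRX135525.10 -q 1e-10
macs2 callpeak -t SRX135525.bam -f BAM -g dm -n SRX135525.20 -q 1e-20
bedtools genomecov -ibam SRX135525.bam -bg > SRX135525.bg        # BedGraph creation; assumed
bedGraphToBigWig SRX135525.bg dm3.chrom.sizes SRX135525.bw       # BedGraph -> BigWig; chrom.sizes file assumed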