Job ID = 1295322
Downloading sra file...
Read layout: SINGLE
Converting to fastq...
2019-06-03T04:24:11 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T04:24:12 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T04:24:12 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T04:24:12 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T04:24:12 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-06-03T04:27:14 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
spots read : 45,697,542
reads read : 45,697,542
reads written : 45,697,542
rm: cannot remove ‘[DSE]RR*’: No such file or directory
rm: cannot remove ‘fastqDump_tmp*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:14:21
45697542 reads; of these:
  45697542 (100.00%) were unpaired; of these:
    2069583 (4.53%) aligned 0 times
    35183191 (76.99%) aligned exactly 1 time
    8444768 (18.48%) aligned >1 times
95.47% overall alignment rate
Time searching: 00:14:21
Overall time: 00:14:21
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 20 files...
[bam_rmdupse_core] 12977952 / 43627959 = 0.2975 in library ' '
Converted to BAM.
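The download, fastq conversion, mapping, and duplicate-removal steps above are reported only through status lines and tool messages. A rough sketch of the corresponding commands is given below; the run accession (SRRXXXXXXX), index path, and thread count are placeholders, the alignment summary above is in bowtie2's format, and the [samopen]/[bam_rmdupse_core] messages indicate an older samtools whose sort syntax differs slightly from the modern form shown here, so this is not the exact ChIP-Atlas invocation.

    # Download the run and convert it to single-end fastq (fasterq-dump 2.9.6 per the log)
    prefetch SRRXXXXXXX               # placeholder accession; the run ID is not named in this log
    fasterq-dump SRRXXXXXXX -O .

    # Single-end mapping against a dm3 index; the summary format above matches bowtie2
    bowtie2 -p 4 -x dm3_index -U SRRXXXXXXX.fastq -S SRX335490.sam

    # SAM -> sorted BAM, then single-end duplicate removal ([bam_rmdupse_core] above)
    samtools view -bS SRX335490.sam | samtools sort -o SRX335490.sorted.bam -
    samtools rmdup -s SRX335490.sorted.bam SRX335490.bam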
Creating Bed file...
Converting to BedGraph...
INFO @ Mon, 03 Jun 2019 14:00:06: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 14:00:06: #1 read tag files...
INFO @ Mon, 03 Jun 2019 14:00:06: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 14:00:06: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 14:00:06: #1 read tag files...
INFO @ Mon, 03 Jun 2019 14:00:06: #1 read treatment tags...
INFO @ Mon, 03 Jun 2019 14:00:07: # Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Mon, 03 Jun 2019 14:00:07: #1 read tag files...
INFO @ Mon, 03 Jun 2019 14:00:07: #1 read treatment tags...
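The three callpeak runs set up above differ only in their q-value cutoff and output prefix, and the interleaved timestamps show them running concurrently. A driver loop of roughly this form would reproduce them (a sketch; the actual ChIP-Atlas script is not part of this log):

    BAM=/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.bam
    for Q in 1e-20 1e-05 1e-10; do
        # the output prefix keeps only the exponent, e.g. SRX335490.20, as in the log above
        macs2 callpeak -t "$BAM" -f BAM -g dm -n "${BAM%.bam}.${Q#1e-}" -q "$Q" &
    done
    wait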
INFO @ Mon, 03 Jun 2019 14:00:14: 1000000
INFO @ Mon, 03 Jun 2019 14:00:14: 1000000
INFO @ Mon, 03 Jun 2019 14:00:15: 1000000
INFO @ Mon, 03 Jun 2019 14:00:22: 2000000
INFO @ Mon, 03 Jun 2019 14:00:22: 2000000
INFO @ Mon, 03 Jun 2019 14:00:24: 2000000
INFO @ Mon, 03 Jun 2019 14:00:30: 3000000
INFO @ Mon, 03 Jun 2019 14:00:30: 3000000
INFO @ Mon, 03 Jun 2019 14:00:32: 3000000
INFO @ Mon, 03 Jun 2019 14:00:37: 4000000
INFO @ Mon, 03 Jun 2019 14:00:37: 4000000
INFO @ Mon, 03 Jun 2019 14:00:40: 4000000
INFO @ Mon, 03 Jun 2019 14:00:45: 5000000
INFO @ Mon, 03 Jun 2019 14:00:45: 5000000
INFO @ Mon, 03 Jun 2019 14:00:49: 5000000
INFO @ Mon, 03 Jun 2019 14:00:53: 6000000
INFO @ Mon, 03 Jun 2019 14:00:53: 6000000
INFO @ Mon, 03 Jun 2019 14:00:57: 6000000
INFO @ Mon, 03 Jun 2019 14:01:00: 7000000
INFO @ Mon, 03 Jun 2019 14:01:01: 7000000
INFO @ Mon, 03 Jun 2019 14:01:05: 7000000
INFO @ Mon, 03 Jun 2019 14:01:08: 8000000
INFO @ Mon, 03 Jun 2019 14:01:08: 8000000
INFO @ Mon, 03 Jun 2019 14:01:13: 8000000
INFO @ Mon, 03 Jun 2019 14:01:15: 9000000
INFO @ Mon, 03 Jun 2019 14:01:15: 9000000
INFO @ Mon, 03 Jun 2019 14:01:21: 9000000
INFO @ Mon, 03 Jun 2019 14:01:22: 10000000
INFO @ Mon, 03 Jun 2019 14:01:22: 10000000
INFO @ Mon, 03 Jun 2019 14:01:29: 10000000
INFO @ Mon, 03 Jun 2019 14:01:29: 11000000
INFO @ Mon, 03 Jun 2019 14:01:30: 11000000
INFO @ Mon, 03 Jun 2019 14:01:37: 12000000
INFO @ Mon, 03 Jun 2019 14:01:37: 12000000
INFO @ Mon, 03 Jun 2019 14:01:37: 11000000
INFO @ Mon, 03 Jun 2019 14:01:44: 13000000
INFO @ Mon, 03 Jun 2019 14:01:44: 13000000
INFO @ Mon, 03 Jun 2019 14:01:45: 12000000
INFO @ Mon, 03 Jun 2019 14:01:51: 14000000
INFO @ Mon, 03 Jun 2019 14:01:51: 14000000
INFO @ Mon, 03 Jun 2019 14:01:53: 13000000
INFO @ Mon, 03 Jun 2019 14:01:58: 15000000
INFO @ Mon, 03 Jun 2019 14:01:58: 15000000
INFO @ Mon, 03 Jun 2019 14:02:01: 14000000
INFO @ Mon, 03 Jun 2019 14:02:05: 16000000
INFO @ Mon, 03 Jun 2019 14:02:06: 16000000
INFO @ Mon, 03 Jun 2019 14:02:09: 15000000
INFO @ Mon, 03 Jun 2019 14:02:13: 17000000
INFO @ Mon, 03 Jun 2019 14:02:13: 17000000
INFO @ Mon, 03 Jun 2019 14:02:17: 16000000
INFO @ Mon, 03 Jun 2019 14:02:20: 18000000
INFO @ Mon, 03 Jun 2019 14:02:20: 18000000
INFO @ Mon, 03 Jun 2019 14:02:25: 17000000
INFO @ Mon, 03 Jun 2019 14:02:27: 19000000
INFO @ Mon, 03 Jun 2019 14:02:27: 19000000
INFO @ Mon, 03 Jun 2019 14:02:33: 18000000
INFO @ Mon, 03 Jun 2019 14:02:34: 20000000
INFO @ Mon, 03 Jun 2019 14:02:34: 20000000
INFO @ Mon, 03 Jun 2019 14:02:41: 19000000
INFO @ Mon, 03 Jun 2019 14:02:42: 21000000
INFO @ Mon, 03 Jun 2019 14:02:42: 21000000
INFO @ Mon, 03 Jun 2019 14:02:49: 22000000
INFO @ Mon, 03 Jun 2019 14:02:49: 22000000
INFO @ Mon, 03 Jun 2019 14:02:49: 20000000
INFO @ Mon, 03 Jun 2019 14:02:56: 23000000
INFO @ Mon, 03 Jun 2019 14:02:56: 23000000
INFO @ Mon, 03 Jun 2019 14:02:57: 21000000
INFO @ Mon, 03 Jun 2019 14:03:03: 24000000
INFO @ Mon, 03 Jun 2019 14:03:03: 24000000
INFO @ Mon, 03 Jun 2019 14:03:05: 22000000
INFO @ Mon, 03 Jun 2019 14:03:11: 25000000
INFO @ Mon, 03 Jun 2019 14:03:11: 25000000
INFO @ Mon, 03 Jun 2019 14:03:13: 23000000
INFO @ Mon, 03 Jun 2019 14:03:18: 26000000
INFO @ Mon, 03 Jun 2019 14:03:18: 26000000
INFO @ Mon, 03 Jun 2019 14:03:21: 24000000
INFO @ Mon, 03 Jun 2019 14:03:25: 27000000
INFO @ Mon, 03 Jun 2019 14:03:25: 27000000
INFO @ Mon, 03 Jun 2019 14:03:29: 25000000
INFO @ Mon, 03 Jun 2019 14:03:33: 28000000
INFO @ Mon, 03 Jun 2019 14:03:33: 28000000
INFO @ Mon, 03 Jun 2019 14:03:37: 26000000
INFO @ Mon, 03 Jun 2019 14:03:40: 29000000
INFO @ Mon, 03 Jun 2019 14:03:40: 29000000
INFO @ Mon, 03 Jun 2019 14:03:45: 27000000
INFO @ Mon, 03 Jun 2019 14:03:47: 30000000
INFO @ Mon, 03 Jun 2019 14:03:47: 30000000
INFO @ Mon, 03 Jun 2019 14:03:52: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 14:03:52: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 14:03:52: #1 total tags in treatment: 30650007
INFO @ Mon, 03 Jun 2019 14:03:52: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 14:03:52: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 14:03:52: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 14:03:52: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 14:03:52: #1 total tags in treatment: 30650007
INFO @ Mon, 03 Jun 2019 14:03:52: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 14:03:52: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 14:03:52: #1 tags after filtering in treatment: 30650007
INFO @ Mon, 03 Jun 2019 14:03:52: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 14:03:52: #1 finished!
INFO @ Mon, 03 Jun 2019 14:03:52: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 14:03:52: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 14:03:52: #1 tags after filtering in treatment: 30650007
INFO @ Mon, 03 Jun 2019 14:03:52: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 14:03:52: #1 finished!
INFO @ Mon, 03 Jun 2019 14:03:52: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 14:03:52: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 14:03:53: 28000000
INFO @ Mon, 03 Jun 2019 14:03:55: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 14:03:55: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 14:03:55: Process for pairing-model is terminated!
INFO @ Mon, 03 Jun 2019 14:03:55: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 14:03:55: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 14:03:55: Process for pairing-model is terminated!
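Both runs above stop at model building: with no control sample and zero paired plus/minus peaks, MACS2 cannot estimate a fragment size, so no narrowPeak output is written (hence the cut and rm errors that follow). The remedy quoted in the warning itself is to skip model building with a fixed extension size, for example:

    # rerun with a fixed 147 bp fragment size, as the warning suggests (paths abbreviated)
    macs2 callpeak -t SRX335490.bam -f BAM -g dm -n SRX335490.05 -q 1e-05 --nomodel --extsize 147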
cut: /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.20_peaks.narrowPeak: No such file or directory
cut: /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
pass1 - making usageList (0 chroms): 1 millis
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.20_*.xls’: No such file or directory
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Mon, 03 Jun 2019 14:04:00: 29000000
INFO @ Mon, 03 Jun 2019 14:04:08: 30000000
INFO @ Mon, 03 Jun 2019 14:04:13: #1 tag size is determined as 50 bps
INFO @ Mon, 03 Jun 2019 14:04:13: #1 tag size = 50
INFO @ Mon, 03 Jun 2019 14:04:13: #1 total tags in treatment: 30650007
INFO @ Mon, 03 Jun 2019 14:04:13: #1 user defined the maximum tags...
INFO @ Mon, 03 Jun 2019 14:04:13: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Mon, 03 Jun 2019 14:04:14: #1 tags after filtering in treatment: 30650007
INFO @ Mon, 03 Jun 2019 14:04:14: #1 Redundant rate of treatment: 0.00
INFO @ Mon, 03 Jun 2019 14:04:14: #1 finished!
INFO @ Mon, 03 Jun 2019 14:04:14: #2 Build Peak Model...
INFO @ Mon, 03 Jun 2019 14:04:14: #2 looking for paired plus/minus strand peaks...
INFO @ Mon, 03 Jun 2019 14:04:17: #2 number of paired peaks: 0
WARNING @ Mon, 03 Jun 2019 14:04:17: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Mon, 03 Jun 2019 14:04:17: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX335490/SRX335490.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
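The BedGraph and BigWig conversions are reported only by their status lines. One common way to produce them from the deduplicated BAM is bedtools plus the UCSC converter, sketched below; the chromosome-sizes file and output names are placeholders, and this may not match the tool chain actually used here.

    # coverage track from the BAM, sorted as bedGraphToBigWig requires
    bedtools genomecov -ibam SRX335490.bam -bg | sort -k1,1 -k2,2n > SRX335490.bedGraph
    # convert to BigWig with the UCSC utility; dm3.chrom.sizes is a placeholder path
    bedGraphToBigWig SRX335490.bedGraph dm3.chrom.sizes SRX335490.bw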