Job ID = 3785779
Downloading the SRA file...
Read layout: SINGLE
Converting to fastq...
2019-11-01T05:28:48 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
[the same mbedtls_ssl_read -76 "timeout exhausted" / "error unknown" message recurred 18 more times between 05:29:45 and 05:53:33]
spots read      : 40,503,524
reads read      : 81,007,048
reads written   : 40,503,524
reads 0-length  : 40,503,524
[the same message recurred a further 16 times between 05:56:17 and 06:10:49]
spots read      : 52,500,853
reads read      : 105,001,706
reads written   : 52,500,853
reads 0-length  : 52,500,853
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:01
Time loading mirror index: 00:00:00
Multiseed full-index search: 00:18:57
93004377 reads; of these:
  93004377 (100.00%) were unpaired; of these:
    29426469 (31.64%) aligned 0 times
    52545138 (56.50%) aligned exactly 1 time
    11032770 (11.86%) aligned >1 times
68.36% overall alignment rate
Time searching: 00:18:58
Overall time: 00:18:58
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 7 sequences.
[bam_sort_core] merging from 28 files...
[bam_rmdupse_core] 24411947 / 63577908 = 0.3840 in library ' '
Converted to BAM.
Creating the Bed file...
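The mbedtls_ssl_read -76 messages during the download are transient network timeouts; fasterq-dump eventually completed here, but a run can also fail outright. A generic retry wrapper is one way to guard against that — a minimal sketch, not part of the ChIP-Atlas pipeline (`run_with_retry`, its parameters, and the accession placeholder are all hypothetical; the log names only the experiment SRX5545241):

```python
import subprocess
import time

def run_with_retry(cmd, retries=3, wait=60, runner=subprocess.run):
    """Re-run a flaky command until it exits 0 or retries are exhausted.

    `runner` is injectable so the retry logic can be tested without the
    real binary; by default it shells out via subprocess.run.
    """
    for attempt in range(1, retries + 1):
        if runner(cmd).returncode == 0:
            return True
        if attempt < retries:
            time.sleep(wait)  # back off before the next attempt
    return False

# Hypothetical usage (accession placeholder):
# run_with_retry(["fasterq-dump", "<accession>", "-O", "out_dir"])
```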
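The bowtie and samtools summaries above are internally consistent, and the arithmetic can be checked directly from the counts printed in the log:

```python
# Counts copied from the bowtie summary above.
total       = 93_004_377   # unpaired reads
aligned_0   = 29_426_469   # aligned 0 times
aligned_1   = 52_545_138   # aligned exactly 1 time
aligned_gt1 = 11_032_770   # aligned >1 times

aligned = aligned_1 + aligned_gt1      # 63,577,908 reads end up in the BAM
overall_rate = 100 * aligned / total   # bowtie reports 68.36%

# bam_rmdupse_core reports 24411947 / 63577908 = 0.3840; note the
# denominator is exactly the number of aligned reads computed above.
dup_rate = 24_411_947 / aligned

print(f"{overall_rate:.2f}% overall alignment rate")
print(f"duplicate fraction: {dup_rate:.4f}")
```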
INFO @ Fri, 01 Nov 2019 15:53:21:
# Command line: callpeak -t /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.bam -f BAM -g ce -n /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.bam']
# control file = None
# effective genome size = 9.00e+07
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Fri, 01 Nov 2019 15:53:21: #1 read tag files...
INFO @ Fri, 01 Nov 2019 15:53:21: #1 read treatment tags...
INFO @ Fri, 01 Nov 2019 15:53:28: 1000000
INFO @ Fri, 01 Nov 2019 15:53:35: 2000000
INFO @ Fri, 01 Nov 2019 15:53:42: 3000000
INFO @ Fri, 01 Nov 2019 15:53:49: 4000000
INFO @ Fri, 01 Nov 2019 15:53:51:
# Command line: callpeak -t /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.bam -f BAM -g ce -n /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.10 -q 1e-10
# ARGUMENTS LIST: [identical to the 1e-05 run except name = .../SRX5545241.10 and qvalue cutoff = 1.00e-10]
INFO @ Fri, 01 Nov 2019 15:53:51: #1 read tag files...
INFO @ Fri, 01 Nov 2019 15:53:51: #1 read treatment tags...
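MACS2 is invoked three times on the same BAM, differing only in the q-value cutoff and the matching output-name suffix (.05, .10, .20). The invocations recorded in the log can be reconstructed programmatically — a sketch that assumes the standard `macs2 callpeak` entry point (the log itself omits the program name):

```python
BASE = "/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241"

def callpeak_cmd(qvalue):
    """Rebuild one callpeak command line as logged; the name suffix
    (.05/.10/.20) is the exponent of the q-value cutoff."""
    suffix = qvalue.split("-")[1]      # "1e-05" -> "05"
    return ["macs2", "callpeak",
            "-t", BASE + ".bam",
            "-f", "BAM",
            "-g", "ce",                # effective genome size 9.00e+07
            "-n", f"{BASE}.{suffix}",
            "-q", qvalue]

commands = [callpeak_cmd(q) for q in ("1e-05", "1e-10", "1e-20")]
```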
INFO @ Fri, 01 Nov 2019 15:53:56: 5000000
INFO @ Fri, 01 Nov 2019 15:53:59: 1000000
INFO @ Fri, 01 Nov 2019 15:54:03: 6000000
INFO @ Fri, 01 Nov 2019 15:54:07: 2000000
INFO @ Fri, 01 Nov 2019 15:54:10: 7000000
INFO @ Fri, 01 Nov 2019 15:54:15: 3000000
INFO @ Fri, 01 Nov 2019 15:54:17: 8000000
Converting to BedGraph...
INFO @ Fri, 01 Nov 2019 15:54:21:
# Command line: callpeak -t /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.bam -f BAM -g ce -n /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.20 -q 1e-20
# ARGUMENTS LIST: [identical to the 1e-05 run except name = .../SRX5545241.20 and qvalue cutoff = 1.00e-20]
INFO @ Fri, 01 Nov 2019 15:54:21: #1 read tag files...
INFO @ Fri, 01 Nov 2019 15:54:21: #1 read treatment tags...
[interleaved read-progress counters from the three concurrent callpeak runs (15:54:23-15:58:03, 1,000,000-tag increments) omitted]
INFO @ Fri, 01 Nov 2019 15:58:04: #1 tag size is determined as 51 bps
INFO @ Fri, 01 Nov 2019 15:58:04: #1 tag size = 51
INFO @ Fri, 01 Nov 2019 15:58:04: #1 total tags in treatment: 39165961
INFO @ Fri, 01 Nov 2019 15:58:04: #1 user defined the maximum tags...
INFO @ Fri, 01 Nov 2019 15:58:04: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Fri, 01 Nov 2019 15:58:05: #1 tags after filtering in treatment: 39165961
INFO @ Fri, 01 Nov 2019 15:58:05: #1 Redundant rate of treatment: 0.00
INFO @ Fri, 01 Nov 2019 15:58:05: #1 finished!
INFO @ Fri, 01 Nov 2019 15:58:05: #2 Build Peak Model...
INFO @ Fri, 01 Nov 2019 15:58:05: #2 looking for paired plus/minus strand peaks...
INFO @ Fri, 01 Nov 2019 15:58:08: #2 number of paired peaks: 0
WARNING @ Fri, 01 Nov 2019 15:58:08: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Fri, 01 Nov 2019 15:58:08: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
[interleaved read-progress counters from the two remaining callpeak runs (15:58:11-15:59:08) omitted]
INFO @ Fri, 01 Nov 2019 15:59:10: #1 tag size is determined as 51 bps
INFO @ Fri, 01 Nov 2019 15:59:10: #1 tag size = 51
INFO @ Fri, 01 Nov 2019 15:59:10: #1 total tags in treatment: 39165961
INFO @ Fri, 01 Nov 2019 15:59:10: #1 user defined the maximum tags...
INFO @ Fri, 01 Nov 2019 15:59:10: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Fri, 01 Nov 2019 15:59:10: #1 tags after filtering in treatment: 39165961
INFO @ Fri, 01 Nov 2019 15:59:10: #1 Redundant rate of treatment: 0.00
INFO @ Fri, 01 Nov 2019 15:59:10: #1 finished!
INFO @ Fri, 01 Nov 2019 15:59:10: #2 Build Peak Model...
INFO @ Fri, 01 Nov 2019 15:59:10: #2 looking for paired plus/minus strand peaks...
INFO @ Fri, 01 Nov 2019 15:59:12: 36000000
INFO @ Fri, 01 Nov 2019 15:59:13: #2 number of paired peaks: 0
WARNING @ Fri, 01 Nov 2019 15:59:13: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Fri, 01 Nov 2019 15:59:13: Process for pairing-model is terminated!
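All three callpeak runs fail at the model-building step with zero paired peaks, and MACS2 itself suggests rerunning with --nomodel and --extsize 147. A pipeline can detect this condition from the log text and apply that fallback automatically — a sketch (the detection regex is our own; only the fallback flags come from the warning above):

```python
import re

def model_build_failed(log_text):
    """True when MACS2 reports zero paired peaks, i.e. the fragment
    shifting model cannot be built from this dataset."""
    m = re.search(r"number of paired peaks: (\d+)", log_text)
    return bool(m) and int(m.group(1)) == 0

def add_fallback(cmd, log_text, extsize=147):
    """Append the fallback MACS2 recommends when the model fails."""
    if model_build_failed(log_text):
        return cmd + ["--nomodel", "--extsize", str(extsize)]
    return cmd
```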
cut: /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Fri, 01 Nov 2019 15:59:19: 37000000
INFO @ Fri, 01 Nov 2019 15:59:27: 38000000
INFO @ Fri, 01 Nov 2019 15:59:35: 39000000
INFO @ Fri, 01 Nov 2019 15:59:36: #1 tag size is determined as 51 bps
INFO @ Fri, 01 Nov 2019 15:59:36: #1 tag size = 51
INFO @ Fri, 01 Nov 2019 15:59:36: #1 total tags in treatment: 39165961
INFO @ Fri, 01 Nov 2019 15:59:36: #1 user defined the maximum tags...
INFO @ Fri, 01 Nov 2019 15:59:36: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Fri, 01 Nov 2019 15:59:37: #1 tags after filtering in treatment: 39165961
INFO @ Fri, 01 Nov 2019 15:59:37: #1 Redundant rate of treatment: 0.00
INFO @ Fri, 01 Nov 2019 15:59:37: #1 finished!
INFO @ Fri, 01 Nov 2019 15:59:37: #2 Build Peak Model...
INFO @ Fri, 01 Nov 2019 15:59:37: #2 looking for paired plus/minus strand peaks...
INFO @ Fri, 01 Nov 2019 15:59:40: #2 number of paired peaks: 0
WARNING @ Fri, 01 Nov 2019 15:59:40: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Fri, 01 Nov 2019 15:59:40: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/ce10/SRX5545241/SRX5545241.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
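The final step converts the BedGraph track to BigWig. The log does not name the converter; UCSC's bedGraphToBigWig is the usual tool for this step (an assumption here, as are the file names in the example — only SRX5545241 comes from the log):

```python
import subprocess

def bedgraph_to_bigwig(bedgraph, chrom_sizes, bigwig, runner=subprocess.run):
    # Assumed UCSC usage: bedGraphToBigWig in.bedGraph chrom.sizes out.bw
    # `runner` is injectable so the call can be tested without the binary.
    cmd = ["bedGraphToBigWig", bedgraph, chrom_sizes, bigwig]
    return runner(cmd)

# Hypothetical example for this experiment:
# bedgraph_to_bigwig("SRX5545241.bg", "ce10.chrom.sizes", "SRX5545241.bw")
```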