Job ID = 4178642
Downloading SRA file...
Read layout: SINGLE
Converting to fastq...
2019-12-05T04:31:02 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-12-05T04:40:29 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-12-05T04:47:11 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
2019-12-05T04:49:17 fasterq-dump.2.9.6 sys: timeout exhausted while reading file within network system module - mbedtls_ssl_read returned -76 ( NET - Reading information from the socket failed )
spots read : 78,708,466
reads read : 157,416,932
reads written : 78,708,466
reads 0-length : 78,708,466
rm: cannot remove ‘[DSE]RR*’: No such file or directory
Converted to fastq.
Mapping with bowtie...
Time loading reference: 00:00:00
Time loading forward index: 00:00:00
Time loading mirror index: 00:00:01
Multiseed full-index search: 00:41:42
78708466 reads; of these:
  78708466 (100.00%) were unpaired; of these:
    1898013 (2.41%) aligned 0 times
    61013380 (77.52%) aligned exactly 1 time
    15797073 (20.07%) aligned >1 times
97.59% overall alignment rate
Time searching: 00:41:43
Overall time: 00:41:43
Mapping completed.
Converting to BAM with samtools...
[samopen] SAM header is present: 15 sequences.
[bam_sort_core] merging from 32 files...
[bam_rmdupse_core] 18545282 / 76810453 = 0.2414 in library ' '
Converted to BAM.
Creating Bed file...
INFO @ Thu, 05 Dec 2019 15:07:58:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.05 -q 1e-05
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.05
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-05
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Thu, 05 Dec 2019 15:07:58: #1 read tag files...
INFO @ Thu, 05 Dec 2019 15:07:58: #1 read treatment tags...
INFO @ Thu, 05 Dec 2019 15:08:08: 1000000
INFO @ Thu, 05 Dec 2019 15:08:27:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.10 -q 1e-10
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.10
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-10
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Thu, 05 Dec 2019 15:08:27: #1 read tag files...
INFO @ Thu, 05 Dec 2019 15:08:27: #1 read treatment tags...
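The messages above cover the standard SRA-to-BAM stages: fasterq-dump 2.9.6 converts the run to fastq (the mbedtls_ssl_read timeouts are transient network errors during the download; the conversion nevertheless completes), the alignment summary has the bowtie2 format, and samtools produces the sorted, duplicate-removed BAM ([bam_sort_core] and [bam_rmdupse_core]). A minimal shell sketch of these stages, assuming hypothetical names ($SRR run accession, $BT2_INDEX bowtie2 index for dm3, $SRX output prefix) and current samtools syntax rather than the exact commands and versions used for this job:

  # Sketch only; accessions, paths, thread counts and tool versions are assumptions.
  SRR=SRR_xxxxxxx        # run accession under SRX7158300 (not shown in this log)
  BT2_INDEX=dm3          # bowtie2 index basename for the dm3 assembly
  SRX=SRX7158300

  prefetch "$SRR"                                   # "Downloading SRA file..."
  fasterq-dump "$SRR" --threads 4 -O .              # single-end, per "Read layout: SINGLE"

  bowtie2 -p 4 -x "$BT2_INDEX" -U "${SRR}.fastq" \
    | samtools view -bS - \
    | samtools sort -o "${SRX}.sorted.bam" -        # produces the [bam_sort_core] messages

  samtools rmdup -s "${SRX}.sorted.bam" "${SRX}.bam"   # single-end duplicate removal ([bam_rmdupse_core])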
INFO @ Thu, 05 Dec 2019 15:08:30: 2000000
INFO @ Thu, 05 Dec 2019 15:08:35: 1000000
INFO @ Thu, 05 Dec 2019 15:08:41: 3000000
INFO @ Thu, 05 Dec 2019 15:08:43: 2000000
INFO @ Thu, 05 Dec 2019 15:08:52: 3000000
INFO @ Thu, 05 Dec 2019 15:08:53: 4000000
Converting to BedGraph...
INFO @ Thu, 05 Dec 2019 15:08:57:
# Command line: callpeak -t /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.bam -f BAM -g dm -n /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.20 -q 1e-20
# ARGUMENTS LIST:
# name = /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.20
# format = BAM
# ChIP-seq file = ['/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.bam']
# control file = None
# effective genome size = 1.20e+08
# band width = 300
# model fold = [5, 50]
# qvalue cutoff = 1.00e-20
# Larger dataset will be scaled towards smaller dataset.
# Range for calculating regional lambda is: 10000 bps
# Broad region calling is off
# Paired-End mode is off
INFO @ Thu, 05 Dec 2019 15:08:57: #1 read tag files...
INFO @ Thu, 05 Dec 2019 15:08:57: #1 read treatment tags...
INFO @ Thu, 05 Dec 2019 15:09:00: 4000000
INFO @ Thu, 05 Dec 2019 15:09:04: 1000000
INFO @ Thu, 05 Dec 2019 15:09:04: 5000000
INFO @ Thu, 05 Dec 2019 15:09:08: 5000000
INFO @ Thu, 05 Dec 2019 15:09:11: 2000000
INFO @ Thu, 05 Dec 2019 15:09:16: 6000000
INFO @ Thu, 05 Dec 2019 15:09:17: 6000000
INFO @ Thu, 05 Dec 2019 15:09:19: 3000000
INFO @ Thu, 05 Dec 2019 15:09:26: 7000000
INFO @ Thu, 05 Dec 2019 15:09:27: 4000000
INFO @ Thu, 05 Dec 2019 15:09:27: 7000000
INFO @ Thu, 05 Dec 2019 15:09:34: 5000000
INFO @ Thu, 05 Dec 2019 15:09:35: 8000000
INFO @ Thu, 05 Dec 2019 15:09:37: 8000000
INFO @ Thu, 05 Dec 2019 15:09:41: 6000000
INFO @ Thu, 05 Dec 2019 15:09:43: 9000000
INFO @ Thu, 05 Dec 2019 15:09:46: 9000000
INFO @ Thu, 05 Dec 2019 15:09:49: 7000000
INFO @ Thu, 05 Dec 2019 15:09:51: 10000000
INFO @ Thu, 05 Dec 2019 15:09:56: 8000000
INFO @ Thu, 05 Dec 2019 15:09:56: 10000000
INFO @ Thu, 05 Dec 2019 15:09:58: 11000000
INFO @ Thu, 05 Dec 2019 15:10:03: 9000000
INFO @ Thu, 05 Dec 2019 15:10:05: 11000000
INFO @ Thu, 05 Dec 2019 15:10:06: 12000000
INFO @ Thu, 05 Dec 2019 15:10:10: 10000000
INFO @ Thu, 05 Dec 2019 15:10:14: 13000000
INFO @ Thu, 05 Dec 2019 15:10:15: 12000000
INFO @ Thu, 05 Dec 2019 15:10:17: 11000000
INFO @ Thu, 05 Dec 2019 15:10:22: 14000000
INFO @ Thu, 05 Dec 2019 15:10:24: 13000000
INFO @ Thu, 05 Dec 2019 15:10:24: 12000000
INFO @ Thu, 05 Dec 2019 15:10:30: 15000000
INFO @ Thu, 05 Dec 2019 15:10:32: 13000000
INFO @ Thu, 05 Dec 2019 15:10:34: 14000000
INFO @ Thu, 05 Dec 2019 15:10:38: 16000000
INFO @ Thu, 05 Dec 2019 15:10:39: 14000000
INFO @ Thu, 05 Dec 2019 15:10:43: 15000000
INFO @ Thu, 05 Dec 2019 15:10:46: 17000000
INFO @ Thu, 05 Dec 2019 15:10:46: 15000000
INFO @ Thu, 05 Dec 2019 15:10:53: 16000000
INFO @ Thu, 05 Dec 2019 15:10:54: 16000000
INFO @ Thu, 05 Dec 2019 15:10:54: 18000000
INFO @ Thu, 05 Dec 2019 15:11:01: 17000000
INFO @ Thu, 05 Dec 2019 15:11:02: 19000000
INFO @ Thu, 05 Dec 2019 15:11:03: 17000000
INFO @ Thu, 05 Dec 2019 15:11:09: 18000000
INFO @ Thu, 05 Dec 2019 15:11:12: 20000000
INFO @ Thu, 05 Dec 2019 15:11:12: 18000000
INFO @ Thu, 05 Dec 2019 15:11:16: 19000000
INFO @ Thu, 05 Dec 2019 15:11:22: 19000000
INFO @ Thu, 05 Dec 2019 15:11:22: 21000000
INFO @ Thu, 05 Dec 2019 15:11:24: 20000000
INFO @ Thu, 05 Dec 2019 15:11:31: 22000000
INFO @ Thu, 05 Dec 2019 15:11:32: 21000000
INFO @ Thu, 05 Dec 2019 15:11:33: 20000000
INFO @ Thu, 05 Dec 2019 15:11:39: 22000000
INFO @ Thu, 05 Dec 2019 15:11:39: 23000000
INFO @ Thu, 05 Dec 2019 15:11:44: 21000000
INFO @ Thu, 05 Dec 2019 15:11:46: 23000000
INFO @ Thu, 05 Dec 2019 15:11:48: 24000000
INFO @ Thu, 05 Dec 2019 15:11:55: 24000000
INFO @ Thu, 05 Dec 2019 15:11:56: 22000000
INFO @ Thu, 05 Dec 2019 15:11:57: 25000000
INFO @ Thu, 05 Dec 2019 15:12:02: 25000000
INFO @ Thu, 05 Dec 2019 15:12:05: 26000000
INFO @ Thu, 05 Dec 2019 15:12:08: 23000000
INFO @ Thu, 05 Dec 2019 15:12:10: 26000000
INFO @ Thu, 05 Dec 2019 15:12:13: 27000000
INFO @ Thu, 05 Dec 2019 15:12:17: 27000000
INFO @ Thu, 05 Dec 2019 15:12:17: 24000000
INFO @ Thu, 05 Dec 2019 15:12:21: 28000000
INFO @ Thu, 05 Dec 2019 15:12:24: 28000000
INFO @ Thu, 05 Dec 2019 15:12:29: 29000000
INFO @ Thu, 05 Dec 2019 15:12:29: 25000000
INFO @ Thu, 05 Dec 2019 15:12:31: 29000000
INFO @ Thu, 05 Dec 2019 15:12:37: 30000000
INFO @ Thu, 05 Dec 2019 15:12:38: 30000000
INFO @ Thu, 05 Dec 2019 15:12:41: 26000000
INFO @ Thu, 05 Dec 2019 15:12:45: 31000000
INFO @ Thu, 05 Dec 2019 15:12:46: 31000000
INFO @ Thu, 05 Dec 2019 15:12:51: 27000000
INFO @ Thu, 05 Dec 2019 15:12:54: 32000000
INFO @ Thu, 05 Dec 2019 15:12:54: 32000000
INFO @ Thu, 05 Dec 2019 15:13:00: 28000000
INFO @ Thu, 05 Dec 2019 15:13:01: 33000000
INFO @ Thu, 05 Dec 2019 15:13:01: 33000000
INFO @ Thu, 05 Dec 2019 15:13:11: 34000000
INFO @ Thu, 05 Dec 2019 15:13:11: 34000000
INFO @ Thu, 05 Dec 2019 15:13:14: 29000000
INFO @ Thu, 05 Dec 2019 15:13:20: 35000000
INFO @ Thu, 05 Dec 2019 15:13:20: 35000000
INFO @ Thu, 05 Dec 2019 15:13:23: 30000000
INFO @ Thu, 05 Dec 2019 15:13:28: 36000000
INFO @ Thu, 05 Dec 2019 15:13:29: 36000000
INFO @ Thu, 05 Dec 2019 15:13:31: 31000000
INFO @ Thu, 05 Dec 2019 15:13:35: 37000000
INFO @ Thu, 05 Dec 2019 15:13:37: 37000000
INFO @ Thu, 05 Dec 2019 15:13:41: 32000000
INFO @ Thu, 05 Dec 2019 15:13:42: 38000000
INFO @ Thu, 05 Dec 2019 15:13:44: 38000000
INFO @ Thu, 05 Dec 2019 15:13:49: 39000000
INFO @ Thu, 05 Dec 2019 15:13:52: 33000000
INFO @ Thu, 05 Dec 2019 15:13:52: 39000000
INFO @ Thu, 05 Dec 2019 15:13:56: 40000000
INFO @ Thu, 05 Dec 2019 15:14:00: 40000000
INFO @ Thu, 05 Dec 2019 15:14:02: 34000000
INFO @ Thu, 05 Dec 2019 15:14:03: 41000000
INFO @ Thu, 05 Dec 2019 15:14:07: 41000000
INFO @ Thu, 05 Dec 2019 15:14:11: 35000000
INFO @ Thu, 05 Dec 2019 15:14:11: 42000000
INFO @ Thu, 05 Dec 2019 15:14:15: 42000000
INFO @ Thu, 05 Dec 2019 15:14:18: 43000000
INFO @ Thu, 05 Dec 2019 15:14:19: 36000000
INFO @ Thu, 05 Dec 2019 15:14:22: 43000000
INFO @ Thu, 05 Dec 2019 15:14:25: 44000000
INFO @ Thu, 05 Dec 2019 15:14:28: 37000000
INFO @ Thu, 05 Dec 2019 15:14:30: 44000000
INFO @ Thu, 05 Dec 2019 15:14:32: 45000000
INFO @ Thu, 05 Dec 2019 15:14:37: 38000000
INFO @ Thu, 05 Dec 2019 15:14:37: 45000000
INFO @ Thu, 05 Dec 2019 15:14:39: 46000000
INFO @ Thu, 05 Dec 2019 15:14:45: 46000000
INFO @ Thu, 05 Dec 2019 15:14:46: 39000000
INFO @ Thu, 05 Dec 2019 15:14:46: 47000000
INFO @ Thu, 05 Dec 2019 15:14:53: 47000000
INFO @ Thu, 05 Dec 2019 15:14:53: 48000000
INFO @ Thu, 05 Dec 2019 15:14:55: 40000000
INFO @ Thu, 05 Dec 2019 15:15:00: 48000000
INFO @ Thu, 05 Dec 2019 15:15:00: 49000000
INFO @ Thu, 05 Dec 2019 15:15:04: 41000000
INFO @ Thu, 05 Dec 2019 15:15:07: 50000000
INFO @ Thu, 05 Dec 2019 15:15:08: 49000000
INFO @ Thu, 05 Dec 2019 15:15:13: 42000000
INFO @ Thu, 05 Dec 2019 15:15:15: 51000000
INFO @ Thu, 05 Dec 2019 15:15:16: 50000000
INFO @ Thu, 05 Dec 2019 15:15:22: 43000000
INFO @ Thu, 05 Dec 2019 15:15:22: 52000000
INFO @ Thu, 05 Dec 2019 15:15:23: 51000000
INFO @ Thu, 05 Dec 2019 15:15:29: 53000000
INFO @ Thu, 05 Dec 2019 15:15:31: 52000000
INFO @ Thu, 05 Dec 2019 15:15:33: 44000000
INFO @ Thu, 05 Dec 2019 15:15:41: 54000000
INFO @ Thu, 05 Dec 2019 15:15:43: 53000000
INFO @ Thu, 05 Dec 2019 15:15:46: 45000000
INFO @ Thu, 05 Dec 2019 15:15:48: 55000000
INFO @ Thu, 05 Dec 2019 15:15:51: 54000000
INFO @ Thu, 05 Dec 2019 15:15:55: 46000000
INFO @ Thu, 05 Dec 2019 15:15:55: 56000000
INFO @ Thu, 05 Dec 2019 15:15:59: 55000000
INFO @ Thu, 05 Dec 2019 15:16:03: 57000000
INFO @ Thu, 05 Dec 2019 15:16:05: 47000000
INFO @ Thu, 05 Dec 2019 15:16:06: 56000000
INFO @ Thu, 05 Dec 2019 15:16:10: 58000000
INFO @ Thu, 05 Dec 2019 15:16:12: #1 tag size is determined as 75 bps
INFO @ Thu, 05 Dec 2019 15:16:12: #1 tag size = 75
INFO @ Thu, 05 Dec 2019 15:16:12: #1 total tags in treatment: 58265171
INFO @ Thu, 05 Dec 2019 15:16:12: #1 user defined the maximum tags...
INFO @ Thu, 05 Dec 2019 15:16:12: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Thu, 05 Dec 2019 15:16:14: #1 tags after filtering in treatment: 58265171
INFO @ Thu, 05 Dec 2019 15:16:14: #1 Redundant rate of treatment: 0.00
INFO @ Thu, 05 Dec 2019 15:16:14: #1 finished!
INFO @ Thu, 05 Dec 2019 15:16:14: #2 Build Peak Model...
INFO @ Thu, 05 Dec 2019 15:16:14: #2 looking for paired plus/minus strand peaks...
INFO @ Thu, 05 Dec 2019 15:16:14: 48000000
INFO @ Thu, 05 Dec 2019 15:16:14: 57000000
INFO @ Thu, 05 Dec 2019 15:16:18: #2 number of paired peaks: 0
WARNING @ Thu, 05 Dec 2019 15:16:18: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Thu, 05 Dec 2019 15:16:18: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.20_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 1 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.20_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.20_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.20_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Thu, 05 Dec 2019 15:16:22: 58000000
INFO @ Thu, 05 Dec 2019 15:16:23: 49000000
INFO @ Thu, 05 Dec 2019 15:16:25: #1 tag size is determined as 75 bps
INFO @ Thu, 05 Dec 2019 15:16:25: #1 tag size = 75
INFO @ Thu, 05 Dec 2019 15:16:25: #1 total tags in treatment: 58265171
INFO @ Thu, 05 Dec 2019 15:16:25: #1 user defined the maximum tags...
INFO @ Thu, 05 Dec 2019 15:16:25: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Thu, 05 Dec 2019 15:16:26: #1 tags after filtering in treatment: 58265171
INFO @ Thu, 05 Dec 2019 15:16:26: #1 Redundant rate of treatment: 0.00
INFO @ Thu, 05 Dec 2019 15:16:26: #1 finished!
INFO @ Thu, 05 Dec 2019 15:16:26: #2 Build Peak Model...
INFO @ Thu, 05 Dec 2019 15:16:26: #2 looking for paired plus/minus strand peaks...
INFO @ Thu, 05 Dec 2019 15:16:31: #2 number of paired peaks: 0
WARNING @ Thu, 05 Dec 2019 15:16:31: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
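Each callpeak run stops at the model-building step: with zero paired peaks, MACS2 cannot estimate the fragment size, so no narrowPeak, model.r, or xls output is written, which is why the subsequent cut and rm commands fail (the same warning repeats below for the remaining threshold). Following the fallback suggested in the warning itself, a rerun without the shifting model could look like the sketch below, which simply appends --nomodel and --extsize 147 to the command line recorded above; it is an illustration, not part of the logged run, shown for the 1e-05 threshold (the 1e-10 and 1e-20 runs would change only -n and -q):

  # Sketch of the fallback suggested by MACS2; not executed in this job.
  macs2 callpeak \
    -t /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.bam \
    -f BAM -g dm \
    -n /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.05 \
    -q 1e-05 \
    --nomodel --extsize 147    # fixed 147-bp extension instead of the paired-peak model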
WARNING @ Thu, 05 Dec 2019 15:16:31: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.10_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.10_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.10_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.10_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
INFO @ Thu, 05 Dec 2019 15:16:31: 50000000
INFO @ Thu, 05 Dec 2019 15:16:41: 51000000
INFO @ Thu, 05 Dec 2019 15:16:50: 52000000
INFO @ Thu, 05 Dec 2019 15:17:01: 53000000
INFO @ Thu, 05 Dec 2019 15:17:19: 54000000
INFO @ Thu, 05 Dec 2019 15:17:38: 55000000
INFO @ Thu, 05 Dec 2019 15:17:49: 56000000
INFO @ Thu, 05 Dec 2019 15:17:59: 57000000
INFO @ Thu, 05 Dec 2019 15:18:10: 58000000
INFO @ Thu, 05 Dec 2019 15:18:13: #1 tag size is determined as 75 bps
INFO @ Thu, 05 Dec 2019 15:18:13: #1 tag size = 75
INFO @ Thu, 05 Dec 2019 15:18:13: #1 total tags in treatment: 58265171
INFO @ Thu, 05 Dec 2019 15:18:13: #1 user defined the maximum tags...
INFO @ Thu, 05 Dec 2019 15:18:13: #1 filter out redundant tags at the same location and the same strand by allowing at most 1 tag(s)
INFO @ Thu, 05 Dec 2019 15:18:15: #1 tags after filtering in treatment: 58265171
INFO @ Thu, 05 Dec 2019 15:18:15: #1 Redundant rate of treatment: 0.00
INFO @ Thu, 05 Dec 2019 15:18:15: #1 finished!
INFO @ Thu, 05 Dec 2019 15:18:15: #2 Build Peak Model...
INFO @ Thu, 05 Dec 2019 15:18:15: #2 looking for paired plus/minus strand peaks...
INFO @ Thu, 05 Dec 2019 15:18:23: #2 number of paired peaks: 0
WARNING @ Thu, 05 Dec 2019 15:18:23: Too few paired peaks (0) so I can not build the model! Broader your MFOLD range parameter may erase this error. If it still can't build the model, we suggest to use --nomodel and --extsize 147 or other fixed number instead.
WARNING @ Thu, 05 Dec 2019 15:18:23: Process for pairing-model is terminated!
cut: /home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.05_peaks.narrowPeak: No such file or directory
pass1 - making usageList (0 chroms): 2 millis
needLargeMem: trying to allocate 0 bytes (limit: 17179869184)
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.05_model.r’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.05_*.xls’: No such file or directory
rm: cannot remove ‘/home/okishinya/chipatlas/results/dm3/SRX7158300/SRX7158300.05_peaks.narrowPeak’: No such file or directory
CompletedMACS2peakCalling
Converted to BedGraph.
Converting to BigWig...
Converted to BigWig.
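The coverage-track steps at the end are reported only through their status lines, so the exact commands are not visible in this log; the pass1/needLargeMem messages above appear to come from a UCSC kent utility (e.g. bedToBigBed) being run on the empty peak output. One common way to produce the BedGraph and BigWig tracks from the deduplicated BAM, using bedtools and the UCSC tools with placeholder file names (dm3.chrom.sizes, output paths), would be:

  # Sketch only; the job's actual BedGraph/BigWig commands are not shown in the log.
  bedtools genomecov -ibam SRX7158300.bam -bg \
    | LC_COLLATE=C sort -k1,1 -k2,2n > SRX7158300.bedGraph    # coverage as a sorted BedGraph
  fetchChromSizes dm3 > dm3.chrom.sizes                       # chromosome sizes for the dm3 assembly
  bedGraphToBigWig SRX7158300.bedGraph dm3.chrom.sizes SRX7158300.bw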