
error extracting PDB from silent from AbinitioRelax with constraints


I am using Rosetta 3.8 under Ubuntu 16.04 64-bit. I ran AbinitioRelax to predict the structure of my protein with certain AtomPair constraints, and I also applied filters using -abinitio::use_filters true.

First, I notice there are different numbers of items in the scores generated for "failed" and "successful" decoys.

For failed decoys (with F_ in the tags) there are 33 items, and for successful decoys (with S_ in the tags) there are 34 items (coordinate_constraint is the extra one).

SCORE:     score     fa_atr     fa_rep     fa_sol    fa_intra_rep    fa_intra_sol_xover4    lk_ball_wtd    fa_elec    pro_close    hbond_sr_bb    hbond_lr_bb    hbond_bb_sc    hbond_sc    dslf_fa13    atom_pair_constraint    angle_constraint    dihedral_constraint      omega     fa_dun    p_aa_pp    yhh_planarity        ref    rama_prepro    Filter_Stage2_aBefore    Filter_Stage2_bQuarter    Filter_Stage2_cHalf    Filter_Stage2_dEnd    co    clashes_total    clashes_bb       time description

SCORE:     score     fa_atr     fa_rep     fa_sol    fa_intra_rep    fa_intra_sol_xover4    lk_ball_wtd    fa_elec    pro_close    hbond_sr_bb    hbond_lr_bb    hbond_bb_sc    hbond_sc    dslf_fa13    atom_pair_constraint    coordinate_constraint    angle_constraint    dihedral_constraint      omega     fa_dun    p_aa_pp    yhh_planarity        ref    rama_prepro    Filter_Stage2_aBefore    Filter_Stage2_bQuarter    Filter_Stage2_cHalf    Filter_Stage2_dEnd    co    clashes_total    clashes_bb       time description
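A quick way to see this mismatch across a whole file is a throwaway field-count check like the one below (my own sketch, not part of Rosetta; it just takes the silent file path as an argument and tallies how many whitespace-separated fields each SCORE: line has):

import sys
from collections import Counter

# Tally the field count of every SCORE: line (headers and per-decoy rows)
# in a silent file, so mixed 33/34-column files are easy to spot.
counts = Counter()
for line in open(sys.argv[1]):
    if line.startswith("SCORE:"):
        counts[len(line.split())] += 1
print(counts)

On a mixed file like mine this reports two different field counts, one for the F_ decoys and one for the S_ decoys.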

Secondly, in a silent file containing multiple decoys, the scoring header line will have 33 or 34 items depending on the first decoy in the file. If the first decoy happens to be a failed one, the header on the 2nd line of the silent file has 33 items. The extract_pdb application is then unable to correctly parse the tags from the 34th column (though the tags are in the 33rd column) and spits an "empty tag" error:

core.io.silent: parse error(   1 L     0.000  141.088  176.700    1.458    0.000    0.000 -176.354   64.289 -147.678    0.000 S_00000003) S_00000003 != empty_tag
 

This can be worked around by manually repeating the tag from the 33rd column in a fake 34th column in the silent file. Is this perhaps a bug?
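In case it helps anyone else, here is a rough Python sketch of that manual workaround (a throwaway helper of my own, not part of Rosetta): it rewrites the silent file so that every short SCORE: data line repeats its trailing tag until it has as many columns as the widest SCORE: line in the file.

import sys

def pad_silent(in_path, out_path):
    lines = open(in_path).readlines()
    # Use the widest SCORE: line (header or data) as the target column count.
    score_lines = [l for l in lines if l.startswith("SCORE:")]
    width = max(len(l.split()) for l in score_lines)
    with open(out_path, "w") as out:
        for line in lines:
            fields = line.split()
            if line.startswith("SCORE:") and fields[1] != "score" and len(fields) < width:
                # Short per-decoy score line: repeat the trailing tag to pad the columns.
                fields += [fields[-1]] * (width - len(fields))
                out.write(" ".join(fields) + "\n")
            else:
                out.write(line)

if __name__ == "__main__":
    pad_silent(sys.argv[1], sys.argv[2])

The padded lines lose their column alignment, but that does not seem to matter since the fields only need to be whitespace-separated.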


Wed, 2017-06-14 04:15
attesor

Definitely a bug. I know someone has been working on bugs involving unextractable silent files recently; I'll bring this to his attention.

(For what it's worth, the whole existence of silent files is due to the "bug" of filesystem design not supporting the needs of computational modeling. PDBs are superior, but they make sysadmins yell about inodes and broken hard drives. Yes, I have strong opinions here.)

Thu, 2017-06-15 12:07
smlewis

Vikram tells me:

 

" I think that I've fixed this particular bug post-3.8.  I don't know what causes the write side (I haven't managed to track that down), but it's a longstanding bug, and I've made the reader robust to it.  Tell the user to get the latest weekly, and he/she should be fine reading in the malformatted silent files."

Fri, 2017-06-16 07:35
smlewis