The coordinates and B-factors shaking itself takes fractions of a second. The question is how many refinement macro-cycles you will need to run afterwards to make sure the model's memory is removed and proper R/Rfree values are recovered (5, 10, 50, ...?). As for the second part of your question -- I don't know, and I don't think there is a definitive answer; I never did any comparisons. Pavel. Jianghai Zhu wrote:
I have seen cases where even simulated annealing didn't help, but this is relatively rare. You can shake the coordinates and B-factors and then refine for as many macro-cycles as necessary (e.g., until the Rfree-Rwork gap stops growing):
phenix.refine model.pdb data.mtz modify_start_model.sites.shake=1.0 modify_start_model.adp.randomize=true
Instead of modify_start_model.adp.randomize=true you can use modify_start_model.adp.set_b_iso=25.0
The value in modify_start_model.sites.shake=1.0 is something to play with, but something like 1.0 ... 1.5 is OK in most cases.
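The stopping criterion above ("refine until the Rfree-Rwork gap does not grow anymore") can be automated when driving phenix.refine from a script. Below is a minimal sketch, assuming you have already collected Rwork/Rfree values per macro-cycle (e.g., parsed from the refinement log); the function name, window size, and tolerance are illustrative choices, not part of PHENIX itself:

```python
def gap_converged(rwork, rfree, window=3, tol=0.002):
    """Return True when the Rfree-Rwork gap has stabilized.

    rwork, rfree: per-macro-cycle R values, oldest first.
    window: number of most recent cycles to inspect.
    tol: maximum spread of the gap over the window to call it flat.
    (window and tol are arbitrary example values -- tune for your case.)
    """
    gaps = [f - w for w, f in zip(rwork, rfree)]
    # Need at least window+1 cycles before judging convergence.
    if len(gaps) < window + 1:
        return False
    recent = gaps[-window:]
    # Converged if the gap barely changed over the last few cycles.
    return max(recent) - min(recent) < tol


# Example: the gap grows early on, then flattens out.
rwork = [0.250, 0.240, 0.235, 0.233, 0.232]
rfree = [0.300, 0.295, 0.292, 0.291, 0.2905]
print(gap_converged(rwork, rfree))
```

One could loop: run one more macro-cycle of phenix.refine, append the new R values, and stop once this returns True.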
Is this coordinates and B-factors shaking as effective as SA? I believe it should be much faster. _______________________________________________ phenixbb mailing list [email protected] http://www.phenix-online.org/mailman/listinfo/phenixbb