Pavel Afonine wrote:
Hi,
thanks for your questions!
1) What's the reason for refining bss, xyz and adp in separate cycles? CNS does that too while refmac (I believe) refines them simultaneously. I suspect I missed something, but I thought the power of ML refinement is that no refinable parameters are held constant if they're not supposed to be. (At least, wasn't that the big deal in Sharp when it came out?)
It is common practice to refine bulk solvent and scale, coordinates, ADPs, and other parameters separately. There are numerical issues behind this (see for example: Acta Cryst. (1978). A34, 791-809; Acta Cryst. (2005). D61, 850-855; ...). CNS does exactly the same: it refines these parameters separately. To my knowledge, the same is true for REFMAC. This is not target specific. The power of ML refinement is that the ML target statistically models the scatterers missing from the model, as well as the model errors (Acta Cryst. (2002). A58, 270-282).
The set of parameters that can be refined simultaneously is a result of the choice of optimization method, not the choice of the function being optimized. The crucial difference is how the second derivatives are handled. In the methods used in Phenix and CNS the second derivatives of all parameters are assumed to be equal and uncorrelated. To ensure that this assumption holds, only parameters of the same category can be varied in a single cycle. This means that the coordinates can be varied, but the B factors and scale factors have to be held fixed. When the B factors are varied, the coordinates and scale factors must be held constant. When the scale factors are varied, everything else must be fixed.

Other refinement programs use second derivatives in a more explicit way than Phenix and do vary more types of parameters in a single cycle. Shelxl is the most powerful and, by default, varies all parameters of all classes in each cycle. Refmac, to the best of my knowledge, refines coordinates and ADPs together, but does refine TLS parameters in a separate step.

Since all the parameters of our models are correlated with one another, it is better to refine as many of them at once as possible. Implementing the joint refinement of all these kinds of parameters is difficult, so to save programmers' time, approximations are sometimes made.

Dale Tronrud
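To make the point about correlated parameters and the diagonal second-derivative approximation concrete, here is a minimal toy sketch (Python/NumPy, not taken from any of the programs mentioned above): a quadratic "target" in two coupled parameters, one standing in for a coordinate and one for a B factor. Alternating group-wise steps that use only the diagonal curvature need many macrocycles to converge, while a single Newton step with the full second-derivative matrix lands on the minimum at once.

    import numpy as np

    # Toy quadratic "refinement target" with two correlated parameter groups:
    # p = [x, b], x standing in for a coordinate, b for a B factor.
    # f(p) = 0.5 * p^T H p - g^T p; the off-diagonal Hessian element couples
    # the two groups, just as coordinates and ADPs are coupled through the
    # calculated structure factors.
    H = np.array([[2.0, 1.5],
                  [1.5, 2.0]])
    g = np.array([1.0, 1.0])

    def grad(p):
        return H @ p - g

    # (1) Grouped refinement: each macrocycle varies one group while the
    #     other is held fixed, using only that group's diagonal curvature
    #     (the coordinates-then-ADPs-then-scales scheme described above).
    p = np.zeros(2)
    for macrocycle in range(20):
        for i in (0, 1):                  # group 0 = "xyz", group 1 = "adp"
            p[i] -= grad(p)[i] / H[i, i]  # 1-D Newton step for that group
    print("grouped refinement:", p)

    # (2) Joint refinement with the full second-derivative matrix: one
    #     Newton step reaches the minimum because the correlation between
    #     the groups is modelled explicitly.
    print("joint refinement:  ", np.linalg.solve(H, g))

The off-diagonal Hessian term here plays the role of the coordinate/ADP correlation: set it to zero and the grouped scheme converges in a single sweep, which is exactly why the parameters are split into groups whose cross-terms can safely be neglected.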