Refinement with partial model - question
Dear Phenix Users,

I have a partial model at low resolution. Is there a way to let phenix.refine know that the model is not complete, so that when a mask is calculated for the bulk solvent correction this fact is taken into consideration? Alternatively, can I somehow use a solvent mask from a density modification program in phenix.refine? Alternatively, is there a program that will automatically build dummy atoms (zero occupancy) at all peaks above a certain sigma in a map (i.e. atoms just for mask calculation)?

Thank you in advance,
Peter.
Hi Peter,
I have a partial model at low resolution. Is there a way to let phenix.refine know that the model is not complete, so that when a mask is calculated for the bulk solvent correction this fact is taken into consideration?
No. The bulk solvent mask is set up everywhere there is no model. Maybe it is a reasonable approximation, at an early stage of refinement, to have something in that area rather than vacuum?
Alternatively, is there a program that will automatically build dummy atoms (zero occupancy) at all peaks above a certain sigma in a map (i.e. atoms just for mask calculation)?
I'm not aware of such a program, although it would only take a couple of hours to write one. I would probably write it if I ever saw a clear need for it or evidence of its usefulness. So far I don't understand why you really need it? Pavel.
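[For readers who want to experiment: below is a minimal sketch of such a dummy-atom writer in plain Python. It is not an existing Phenix tool; peak picking is assumed to have been done elsewhere, and the file names and the residue name DUM are placeholders.]

# Sketch: write zero-occupancy dummy atoms at given peak positions,
# for use in mask calculation only (occ=0 keeps them out of Fcalc).
def write_dummy_atoms(peaks, pdb_out, resname="DUM"):
    # peaks: list of (x, y, z) orthogonal coordinates in Angstrom
    with open(pdb_out, "w") as out:
        for i, (x, y, z) in enumerate(peaks, start=1):
            out.write(
                "HETATM%5d  O   %3s A%4d    %8.3f%8.3f%8.3f%6.2f%6.2f"
                "           O\n" % (i, resname, i, x, y, z, 0.00, 20.00))
        out.write("END\n")

# Example: two hypothetical peak positions taken from a peak search
write_dummy_atoms([(12.3, 4.5, 6.7), (8.9, 10.1, 11.2)], "dummies.pdb")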
Hi, Pavel and Peter,
Alternatively, is there a program that will automatically build dummy atoms (zero occupancy) at all peaks above a certain sigma in a map (i.e. atoms just for mask calculation)?
I'm not aware of such a program, although it would only take a couple of hours to write one. I would probably write it if I ever saw a clear need for it or evidence of its usefulness. So far I don't understand why you really need it?
I think Peter touches on a point that we discussed previously, and this is one more reason to think about it in practical terms: it would be nice if Phenix had an alternative option NOT to build a mask from the model, but to read one from outside. For example, it may be an EM envelope, or a mask built by any automated procedure. An intermediate option is that Phenix indeed provides a procedure to fill an external mask with dummy atoms with occ=0; you then add them to your "regular" model and build the mask by the usual procedure. Would that be easier and faster? I understand that any solution requires your personal time (as usual, initial time estimates multiplied by 10 :-) Best regards, Sacha
I think Peter touches on a point that we discussed previously, and this is one more reason to think about it in practical terms: it would be nice if Phenix had an alternative option NOT to build a mask from the model, but to read one from outside. For example, it may be an EM envelope, or a mask built by any automated procedure.
I still don't see why it is necessary/important. Also, if you know a better way of mask calculation (or an alternative way that may sometimes be better), then I would rather implement that than spend time on an option to allow a user-made mask.
An intermediate option is that Phenix indeed provides a procedure to fill an external mask with dummy atoms with occ=0; you then add them to your "regular" model and build the mask by the usual procedure.
If I see a case where it really makes a positive difference I will think about implementing it -:) Pavel.
On Thu, 2010-04-01 at 17:06 -0700, Pavel Afonine wrote:
I still don't see why it is necessary/important. Also, if you know a better way of mask calculation (or an alternative way that may sometimes be better), then I would rather implement that than spend time on an option to allow a user-made mask.
It may be important to those who would like to experiment with masks but don't have the time or interest to create their own crystal structure refinement software. For instance, a long time ago I, like many others before and after, discovered that sometimes there are empty cavities inside proteins which get filled with bulk solvent, resulting in negative density blobs. Not a big deal, really, but it was possible to appease the inner nitpicker by generating a mask in CNS, running it through USF's MAMA to remove isolated islands, and supplying it to CNS refinement as an external mask. I haven't seen much change in R-factors (naturally), but the negative density was gone.

Now, I can suggest this as a "better way of mask calculation" (and who knows, maybe it's already implemented although I can't find anything suggesting it on the website). Expectation is that you have to be convinced that it works at least in some cases. Catch-22: suggestions will be implemented only if evidence is provided to demonstrate that suggested changes in the algorithm have some effect. Which can't be done without implementing the suggestion.

Ultimately, this question tests the philosophy of Phenix as software: specifically, whether users are encouraged to experiment with possible improvements, or such activities are exclusive to the developer team.

Cheers,
Ed.

--
Edwin Pozharski, PhD, Assistant Professor
University of Maryland, Baltimore
----------------------------------------------
When the Way is forgotten duty and justice appear;
Then knowledge and wisdom are born along with hypocrisy.
When harmonious relationships dissolve then respect and devotion arise;
When a nation falls to chaos then loyalty and patriotism are born.
------------------------------ / Lao Tse /
Hi Pavel,
Refinement of severely partial models can be greatly facilitated by some way of describing the missing parts, or at least by knowing where there is solvent and where there is protein.
Some of these points are described in: E. Blanc, P. Roversi, C. Vonrhein, C. Flensburg, S. M. Lea and G. Bricogne (2004). "Refinement of severely incomplete structures with maximum likelihood in BUSTER-TNT." Acta Cryst. D60, 2210-2221.
As for better ways to calculate masks for severely partial/wrong models:
- EM/SAXS models of the entire structure;
- when NCS is available, density modification + averaging will yield a mask that may cover the entire protein.
Peter.
Hi Peter,
Refinement of severely partial models can be greatly facilitated by some way of describing the missing parts, or at least by knowing where there is solvent and where there is protein. Some of these points are described in: E. Blanc, P. Roversi, C. Vonrhein, C. Flensburg, S. M. Lea and G. Bricogne (2004). "Refinement of severely incomplete structures with maximum likelihood in BUSTER-TNT." Acta Cryst. D60, 2210-2221.
I agree. But if I remember correctly, they set up some a priori information in those areas rather than just leaving them empty. So when you let bulk solvent fill in these areas, you assume that there is some density there; otherwise you assume that there is nothing there (which I guess is worse). Pavel.
Hi Ed,
It may be important to those who would like to experiment with masks but don't have the time or interest to create their own crystal structure refinement software.
True.
For instance, a long time ago I, like many others before and after, discovered that sometimes there are empty cavities inside proteins which get filled with bulk solvent, resulting in negative density blobs
Not that long ago this problem was brought up by Dirk and Morten, and a few more people.
Now, I can suggest this as a "better way of mask calculation" (and who knows, maybe it's already implemented although I can't find anything suggesting it on the website). Expectation is that you have to be convinced that it works at least in some cases. Catch-22: suggestions will be implemented only if evidence is provided to demonstrate that suggested changes in the algorithm have some effect. Which can't be done without implementing the suggestion.
Yes, it's another item in the list of things to do. It just has lower priority and so will not appear tomorrow.
Ultimately, this question tests the philosophy of Phenix as software: specifically, whether users are encouraged to experiment with possible improvements, or such activities are exclusive to the developer team.
It's for a broad range of users: from experimenters (this is why phenix.refine has more than 300 parameters available to play with) to newbies who can just run "phenix.refine model.pdb data.mtz". Plus, if you have some knowledge of Python and crystallographic methods, you can write your own refinement program using the cctbx and mmtbx libraries - just use them as a Lego constructor. And don't forget to experiment along the way.

We have our own endless list of interesting things to implement and try out, plus we get great ideas from the user community. If we spent our developers' time doing them all right now, we would never get anywhere. So we have to be selective and work only on those that are the most important. What's "important" is defined by the number of user requests or by our own considerations. Pavel.
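[As a flavor of the "Lego constructor" idea: a minimal sketch that computes R-factors with mmtbx's f_model machinery. It is illustrative only - the free-R flags are generated on the fly here rather than read from the file, and the exact name of the scaling call has varied between cctbx versions.]

import iotbx.pdb
import mmtbx.f_model
from iotbx import reflection_file_reader

# Model -> X-ray structure
xrs = iotbx.pdb.input(file_name="model.pdb").xray_structure_simple()

# Data: take the first amplitude array found in the MTZ file
f_obs = None
for ma in reflection_file_reader.any_reflection_file(
    file_name="data.mtz").as_miller_arrays():
  if ma.is_xray_amplitude_array():
    f_obs = ma
    break

# Illustration only: in real refinement, reuse the deposited flags
flags = f_obs.generate_r_free_flags(fraction=0.05)

fmodel = mmtbx.f_model.manager(
  f_obs=f_obs, r_free_flags=flags, xray_structure=xrs)
fmodel.update_all_scales()  # bulk-solvent correction and scaling
print("Rwork = %.4f  Rfree = %.4f" % (fmodel.r_work(), fmodel.r_free()))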
Hi, I converted the MTZ to an X-PLOR file and did the refinement. When trying to use the MTZ file itself, Phenix complains about the MDchecksum for Rfree. I do have F, SIGF, I, IMEAN and Rfree flags in the MTZ file; converted to X-PLOR format, Phenix did not give me any warning about the MDchecksum. Any suggestions? Thanks, ram
I converted the MTZ to an X-PLOR file and did the refinement. When trying to use the MTZ file itself, Phenix complains about the MDchecksum for Rfree. I do have F, SIGF, I, IMEAN and Rfree flags in the MTZ file; converted to X-PLOR format, Phenix did not give me any warning about the MDchecksum.
That's odd. I'd expect phenix.refine to work with either reflection file and to pick up the correct value for the test set flag automatically. If you send me the files (off-list) I'll take a look. Ralf
Thanks, I will send you the MTZ file; the paper is in progress, so I will send it as soon as possible.
I'd like to know:

1. If I use the MTZ by default, which contains the following labels: F, SIGF, IMEAN, ISIGMA, Rfree flags - why does Phenix pick IMEAN instead of the Fs?

2. The number of reflections used for refinement differs after IMEAN is converted to Fobs by Phenix:
- with IMEAN, the number of reflections is 8770; the phenix log reports "Number of F-obs in resolution range: 9120" and "Number of F-obs <= 0: 377";
- if I instead use the MTZ written by the Phenix run that converts IMEAN to Fobs, the number of reflections is 9121.
The number of Rfree reflections also varies by 15 between the two cases.

In both cases I ran Phenix with defaults:

phenix.refine model.pdb x.mtz

Any suggestions?
thanks
ram
1. If I use the MTZ by default, which contains the following labels: F, SIGF, IMEAN, ISIGMA, Rfree flags - why does Phenix pick IMEAN instead of the Fs?
Because IMEAN is assumed to be more directly related to the experimental observations. It is a fairly weak argument, but a decision had to be made somehow. You can force phenix.refine to use F:

phenix.refine xray_data.labels=F,SIGF
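[To see which columns the file actually contains - and therefore what phenix.refine has to choose from - the MTZ dump utility bundled with Phenix is handy:]

phenix.mtz.dump x.mtz

It lists each column label and type, which also makes it easier to spell the xray_data.labels=... override correctly.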
In both cases I ran Phenix with defaults: phenix.refine model.pdb x.mtz
I'd need the files to see what's happening. Ralf
Hello all,

I am refining a structure which contains several uranyl sites. I know the atoms should be there based on peaks in anomalous difference Fourier maps; peak heights are greater than 10 sigma. Likewise, we have solved the apo-structure before with the same heavy atoms. If I perform simulated annealing, the heteroatoms get pushed as far as 8 angstroms away. The Asp and Glu residues that coordinate the uranyl ion then just move into the region of the map that the U has vacated. The resulting Fo-Fc difference map has a huge peak (>11 sigma) where the U should be. I noticed that if I inflate the starting B-factor for the U, the positions stay put (relatively). What is the best thing to do here? Can I fix their positions so that this does not happen? How do I do it? I found a post in the archives that says I can use refine.sites.individual="not element U". Will this work for the SA? I know it works during refinement. Anyway, the data is to 3 angstrom. Here is the result from some SA runs:

following simulated annealing:
no U, starting B-factor for all atoms 20: Final R-work = 0.2502, R-free = 0.2899
with U, starting B-factor 20: Final R-work = 0.2771, R-free = 0.3094
with inflated starting B-factor for U: Final R-work = 0.2682, R-free = 0.2983
After further manual rebuilding and TLS refinement (with U): Final R-work = 0.2478, R-free = 0.2849

Also, during further refinement (if I reposition the U(s)), some of the U(s) still get displaced and the B-factors get extremely high, some as high as 500. I think this is due to them being displaced, though. Other datasets which I have refined with Phenix don't exhibit this behavior. Could it have something to do with NCS restraints on the protein?

Any suggestions?

Thanks in advance- Todd
Hi, I wanted to do low-resolution rigid body refinement with secondary structure restraints. I have not seen any command or information in the Phenix documentation. Any help? Thanks, ram
Hi, rigid body refinement does not employ any restraints. You can use secondary structure restraints in refinement of individual coordinates. Actually, you can use secondary structure restraints in constrained rigid body refinement (SA in torsion angle space). To use secondary structure restraints: "secondary_structure_restraints=true". phenix.refine GUI should have an option for this too. Pavel. On 4/7/10 4:38 PM, r n wrote:
Hi, I wanted to do low-resolution rigid body refinement with secondary structure restraints. I have not seen any command or information in the Phenix documentation. Any help? Thanks, ram
On Wed, Apr 7, 2010 at 4:48 PM, Pavel Afonine wrote:
rigid body refinement does not employ any restraints. You can use secondary structure restraints in refinement of individual coordinates.
Actually, you can use secondary structure restraints in constrained rigid body refinement (SA in torsion angle space).
I'm not sure to what extent this is true. Ralf, what would be the effect of those extra H-bonds when running torsion-angle dynamics? Will they be used in the rigidity analysis, or will they just behave as they would in Cartesian dynamics? -Nat
I'm not sure to what extent this is true. Ralf, what would be the effect of those extra H-bonds when running torsion-angle dynamics? Will they be used in the rigidity analysis, or will they just behave as they would in Cartesian dynamics?
The rigidity analysis excludes bonds with a slack != 0; they will only be maintained as restraints. All bonds with slack == 0 are treated the same, no matter how they were defined (cif or custom or any other way). Ralf
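[To illustrate the rule Ralf describes - not actual cctbx code, just the selection logic: bonds are partitioned by their slack, and only the zero-slack ones enter the rigidity analysis.]

from collections import namedtuple

# 'Bond' stands in for cctbx's bond proxies; slack != 0 is typical of
# the secondary-structure H-bond restraints discussed above.
Bond = namedtuple("Bond", ["i", "j", "slack"])
bonds = [Bond(0, 1, 0.0), Bond(1, 2, 0.0), Bond(2, 10, 0.1)]

rigidity_bonds = [b for b in bonds if b.slack == 0]  # define rigid clusters
restraint_only = [b for b in bonds if b.slack != 0]  # kept only as restraints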
Hello list,

I would like to add a follow-up to yesterday's low-resolution refinement question: What is low resolution? At what resolution should we use only rigid body refinement, and when can we use minimization as well?

In my case, I have a 2.7 A and a 3.7 A structure, of two different mutants, in the same space group and with the same cell parameters. After refining the high-resolution one, I'm wondering what the best strategy is to refine the low-resolution one. If I do a simple rigid body refinement, I get R=32.1, Rfree=35.8. If I use individual site refinement, I get R=30.7, Rfree=35.8 (both with group ADP, TLS and NCS). The statistics look good and the maps look great. There isn't any feature in the difference map, so I'm guessing I'm done with refinement? (I'm afraid of over-refining it, as the maps look too good for this resolution.) I was also advised to add hydrogen bond restraints to maintain secondary structures, I guess when using minimization refinement?

Basically, could you update me on what's today's dogma on what type of refinement we should use at what resolution? (I couldn't find this info on the Phenix web site - sorry if I missed it.)

Thank you very much,
vincent
-- Vincent Chaptal, Dept. of Physiology at UCLA, http://www.physiology.ucla.edu/Labs/Abramson/index.html
Hi Vincent,
I can tell you from my work with a 5 A virus capsid that NCS can make your maps look much better than the resolution dictates. You didn't say how much NCS you have, but you may also consider NCS averaging as a tool. Also be aware of high-order NCS contaminating the free set of reflections - I think Phenix can auto-magically pick free-set reflections in thin shells to mitigate this effect.
-Christina
On Fri, Apr 9, 2010 at 11:52 AM, Christina Bourne wrote:
I can tell you from my work with a 5 A virus capsid that NCS can make your maps look much better than the resolution dictates. You didn't say how much NCS you have, but you may also consider NCS averaging as a tool. Also be aware of high-order NCS contaminating the free set of reflections - I think Phenix can auto-magically pick free-set reflections in thin shells to mitigate this effect.
Partially correct - it can pick in thin shells but this isn't automatic. Either the reflection file editor in the GUI or phenix.refine (when starting without a test set) will do this if requested. For general reading on low-resolution crystallography, there are a couple of recent reviews by Axel Brunger et al.: http://www.ncbi.nlm.nih.gov/pubmed/19171967 http://www.ncbi.nlm.nih.gov/pubmed/16855310 -Nat
Hi Vincent,
What is low resolution?
you may consider having a look at the discussion of resolution limits presented here: Acta Cryst. (2009). D65, 1283-1291.

These are good (in my opinion) publications about low-resolution refinement challenges; some examples:

- B. DeLaBarre and A. T. Brunger. "Considerations for the refinement of low-resolution crystal structures." Acta Cryst. (2006). D62, 923-932.
- A. Brunger. "Low-Resolution Crystallography Is Coming of Age." Structure, Volume 13, Issue 2, pp. 171-172.
- A. M. Karmali, T. L. Blundell and N. Furnham. "Model-building strategies for low-resolution X-ray crystallographic data." Acta Cryst. (2009). D65, 121-127.
- G. F. Schroeder, M. Levitt and A. T. Brunger. "Super-resolution biomolecular crystallography with low-resolution data." Nature, doi:10.1038/nature08892 (2010).
At what resolution should we use only rigid body refinement, and when can we use minimization as well?
Let's first avoid mixing the unmixable: model parameterization (rigid body, individual coordinates, etc.) and optimization algorithms (minimization, simulated annealing, etc.).
In my case, I have a 2.7 A and a 3.7 A structure,
2.7 A resolution:

- Coordinates: at this resolution you definitely refine individual coordinates. Depending on how far your model is from the final one, you can use SA (if the model is still poor) or gradient-driven minimization (if the model quality is decent) as optimization choices. Use NCS if available, but exercise care when selecting NCS-related groups. phenix.refine can select NCS groups automatically, but it is always a good idea to review the automatic choice manually. You can also try refinement without NCS but with tighter restraints - I've seen cases where it works much better. If you observe overfitting, optimize the X-ray/restraints target weights (use optimize_wxc=true to do it automatically, or do it manually by playing with the wxc_scale parameter). Also, it is a good idea to run a final refinement (before PDB deposition) with weight optimization turned on. Use local real-space refinement (the fix_rotamers=true option; more details: http://cci.lbl.gov/~afonine/rsr.pdf).

- ADPs (B-factors): refine individual ADPs. Use TLS (phenix.refine does combined individual ADP + TLS refinement, which is the best option most of the time). Carefully select TLS groups. Here is how I typically use TLSMD (copied from one of my previous posts):

1. Reset all B-factors to an average value: phenix.pdbtools model.pdb set_b_iso=25
If model.pdb has anisotropic atoms (ANISOU records), then you also need to convert them to isotropic: "convert_to_iso=true".
2. Refine group ADPs only: phenix.refine model_all_B_25.pdb data.mtz strategy=group_adp group_adp_refinement_mode=two_adp_groups_per_residue
3. Submit the refined model to TLSMD.
4. Discard the refined model (the one obtained at step #2).
5. Use the selections for TLS groups in subsequent refinement.

If you turn on NCS restraints, they will be applied to ADPs as well (to the residual B-factors, not the total). Sometimes, very rarely, if the data is horribly incomplete and the data-to-parameters ratio is bad, and tightening ADP restraints doesn't help, group B-factor refinement (with one or two refinable isotropic ADPs per residue) is an option to try. In my experience I had to do this in a refinement against neutron data.

3.7 A resolution:

Try the above guidelines for 2.7 A first (except probably "fix_rotamers"), and if that doesn't work, then:

- Coordinates: try torsion angle parameterization (constrained rigid body); SA is the only working optimization protocol available in phenix.refine for this type of parameterization. Try secondary structure restraints if you see that the secondary structure gets distorted during refinement.

- ADP refinement: try group ADP refinement (in combination with TLS) instead of highly restrained individual ADPs.

If the 3.7 A structure is an analog of the 2.7 A structure, then you may use the higher-resolution structure as a source of restraints for the low-resolution one (I think this option will be available soon in phenix.refine - check with Jeff Headd, who is working on it).

If you have high-order NCS, use the "thin-shells" algorithm to assign free-R flags (available in PHENIX). This is not done by default - you need to turn this flag on yourself.

Did you check for twinning?

Optimize the mask calculation (the main.optimize_mask=True option). This may give you a 1-3% better R-factor at low resolution. The next version of PHENIX will do it automatically.
If I do a simple rigid body refinement, I get R=32.1, Rfree=35.8. If I use individual site refinement, I get R=30.7, Rfree=35.8 (both with group ADP, TLS and NCS).
Here is how your numbers compare to those for structures in the PDB at similar resolution:

phenix.r_factor_statistics 3.7 left_offset=0.2 right_offset=0.2 n_bins=5

Histogram of Rwork for models in PDB at resolution 3.50-3.90 A:
  0.168 - 0.209 :  8
  0.209 - 0.250 : 37
  0.250 - 0.291 : 65
  0.291 - 0.332 : 36  <<< your model
  0.332 - 0.373 : 13
Histogram of Rfree for models in PDB at resolution 3.50-3.90 A:
  0.184 - 0.231 :  2
  0.231 - 0.278 : 23
  0.278 - 0.324 : 72
  0.324 - 0.371 : 51  <<< your model
  0.371 - 0.418 : 11
Histogram of Rfree-Rwork for all models in PDB at resolution 3.50-3.90 A:
  0.002 - 0.020 : 24
  0.020 - 0.038 : 42
  0.038 - 0.056 : 47  <<< your model
  0.056 - 0.075 : 35
  0.075 - 0.093 : 11
Number of structures considered: 159

Doesn't look too bad, I would say.
Basically, could you update me on what's today's dogma on what type of refinement we should use at what resolution?
No dogma. Try a few possible refinement strategies given the resolution and model quality and see which one works best. This is more robust and time-efficient than shaking the air with speculations about what could be used and why. Once you have found the best refinement protocol for your particular case, you can spend some time reviewing what you tried and why one strategy worked better than another. Typically you need to try 3-5 refinement protocols, which is not a big deal given the speed of today's computers. Pavel.
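[To make the recipe above concrete, here is one possible command sequence assembled from the options mentioned in this thread; file names are placeholders, and how the flags combine may depend on your Phenix version.]

# 1. Reset B-factors before TLSMD (convert ANISOU records to isotropic)
phenix.pdbtools model.pdb set_b_iso=25 convert_to_iso=true

# 2. Group-ADP-only refinement, two ADP groups per residue
phenix.refine model_all_B_25.pdb data.mtz strategy=group_adp group_adp_refinement_mode=two_adp_groups_per_residue

# 3. After obtaining TLS group selections from TLSMD: refine with weight
#    optimization, rotamer fixing and mask optimization
phenix.refine model.pdb data.mtz optimize_wxc=true fix_rotamers=true main.optimize_mask=True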
Hi Phenix, I am wondering what the difference is between an omit map produced by refinement without the ligand and an omit map produced by AutoBuild. Maia
Hi Maia, if you compute a simple omit map in Phenix, omitting the region around the ligand in a PDB that contains it, the result will be very similar to refining without the ligand. All the best, Tom T
Hi Maia, just a remark: probably the quickest and simplest way to see whether your ligand is there is to do these two steps: 1) physically remove the ligand from the PDB file; 2) compute an Average Kick mFo-DFc map using phenix.refine or phenix.maps. Pavel.
I am wondering what the difference is between an omit map produced by refinement without the ligand and an omit map produced by AutoBuild.
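[A minimal sketch of step 1) in plain Python, for those who prefer not to edit the PDB file by hand; the residue name "LIG" is a placeholder for the ligand's three-letter code.]

# Strip a ligand from a PDB file before computing the omit/kick map.
def strip_ligand(pdb_in, pdb_out, resname="LIG"):
    with open(pdb_in) as fin, open(pdb_out, "w") as fout:
        for line in fin:
            # residue name lives in columns 18-20 of ATOM/HETATM records
            if line.startswith(("ATOM", "HETATM")) and \
               line[17:20].strip() == resname:
                continue  # drop the ligand atoms
            fout.write(line)

strip_ligand("model.pdb", "model_no_ligand.pdb")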
Hi Pavel, sorry to send this letter again, as I did not get an answer. I ran phenix.refine to produce a kicked map after deleting my ligand (to get an omit map). I could not make the Average Kick mFo-DFc Map using phenix.refine or phenix.maps. I got an error message:

Sorry: Duplicate mtz_label_amplitudes:2FOFCWT

Here is part of my .def file. How should I change it?

map {
  mtz_label_amplitudes = None
  mtz_label_phases = None
  likelihood_weighted = None
  obs_factor = None
  calc_factor = None
  kicked = False
  fill_missing_f_obs_with_weighted_f_model = True
}
map {
  mtz_label_amplitudes = "2FOFCWT"
  mtz_label_phases = "PH2FOFCWT"
  likelihood_weighted = True
  obs_factor = 2
  calc_factor = 1
  kicked = True
  fill_missing_f_obs_with_weighted_f_model = True
}
map {
  mtz_label_amplitudes = "FOFCWT"
  mtz_label_phases = "PHFOFCWT"
  likelihood_weighted = True
  obs_factor = 1
  calc_factor = 1
  kicked = True
  fill_missing_f_obs_with_weighted_f_model = False
}
map {
  mtz_label_amplitudes = "2FOFCWT_no_fill"
  mtz_label_phases = "PH2FOFCWT_no_fill"
  likelihood_weighted = True
  obs_factor = 2
  calc_factor = 1
  kicked = False
  fill_missing_f_obs_with_weighted_f_model = False
}
Hi Maia,
Hi Pavel, sorry to send this letter again, as I did not get an answer.
I ran phenix.refine to produce a kicked map after deleting my ligand (to get an omit map). I could not make the Average Kick mFo-DFc Map using phenix.refine or phenix.maps.
I got an error message. Sorry: Duplicate mtz_label_amplitudes:2FOFCWT
Here is part of my .def file. How should I change it?
You have:
map { mtz_label_amplitudes = "2FOFCWT" mtz_label_phases = "PH2FOFCWT" likelihood_weighted = True obs_factor = 2 calc_factor = 1 kicked = True fill_missing_f_obs_with_weighted_f_model = True }
Replace it with, for example (note the two differences between the block above and the one below):

map {
  mtz_label_amplitudes = "2FOFCWT_kick"
  mtz_label_phases = "PH2FOFCWT_kick"
  likelihood_weighted = True
  obs_factor = 2
  calc_factor = 1
  kicked = True
  fill_missing_f_obs_with_weighted_f_model = True
}

Again, you have:
map { mtz_label_amplitudes = "FOFCWT" mtz_label_phases = "PHFOFCWT" likelihood_weighted = True obs_factor = 1 calc_factor = 1 kicked = True fill_missing_f_obs_with_weighted_f_model = False }
Replace it with:
map { mtz_label_amplitudes = "FOFCWT_kicked" mtz_label_phases = "PHFOFCWT_kicked" likelihood_weighted = True obs_factor = 1 calc_factor = 1 kicked = True fill_missing_f_obs_with_weighted_f_model = False }
Good luck! Pavel.
On Wed, Apr 7, 2010 at 4:38 PM, r n wrote:
I wanted to do low-resolution rigid body refinement with secondary structure restraints. I have not seen any command or information in the Phenix documentation.
Okay, finally writing documentation now. Sorry! There isn't actually a method for combining rigid body refinement with secondary structure information yet; it would probably be a day or two of work for Pavel or me. However, the "morph" option in AutoBuild may do a better job of moving pieces of that size into place. (And there are probably other programs, which I haven't used, that do something similar.) You can use the secondary structure restraints with simple coordinate refinement (strategy individual_sites) by adding "main.secondary_structure_restraints=True" to the command line (there is a corresponding button in the GUI, but only if you have the alpha-test mode turned on). Everything else is (in theory) automatic, but if you aren't using hydrogens, add "h_bond_restraints.substitute_n_for_h=True" as well, just to be safe. -Nat
it would probably be a day or two of work for Pavel or me.
Actually, in rigid body refinement atoms do not move w.r.t. each other - the rigid body groups move as a whole, so I don't understand why one would need secondary structure restraints. The only restraints that make sense are those that would prevent rigid bodies from running into one another, but that is definitely more than a couple of days of work (for me, at least) -:) A simple geometry non-bonded restraint term is a possible and easy-to-add option, but not a good idea: one or two ideally coinciding atoms may throw the whole rigid body miles away from its original position (because the non-bonded term grows extremely steeply as atoms approach each other). Possibly better is to use overlap functions (computed from a calculated density map, for example - so-called "soft non-bonded restraints") - but this is a little project for someone who's ready to jump into it -:) Pavel.
Hi Todd,

in general you can refine any selected part of your model using any combination of refinement strategies. There are a few exceptions, and Cartesian SA is one of them: it is applied to all atoms. It is on the list of things to address.

SA refinement is good at early stages of refinement to correct gross errors, but it may do more harm than good for a relatively good structure. I think in your case, Rwork/Rfree = 0.2478/0.2849, you can safely continue without doing SA. Although what you can try is the new feature - local real-space refinement (the "fix_rotamers=true" option). This is a very powerful tool for automatic correction of side chains (you need one of the latest PHENIX versions for this). More details here: http://cci.lbl.gov/~afonine/rsr.pdf

Regarding the B-factors: yes, this seems logical - if some U atoms get displaced out of their density peaks, then the refinement will try to smear them out so they do not contribute to Fcalc, and enormously increased B-factors are the way to do this. If you allow occupancy refinement, their occupancies will probably refine to zero or close to it.

I'm not sure I understood this: are these atoms still moving out of density even if you refine without SA? If so, I do not know why this happens and I would really like to know. If you send me the data and model I might be able to figure this out.

Let me know if you have any questions or problems!

Pavel.
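[A concrete spelling of the selection Todd found in the archives; file names are placeholders. This keeps the uranium atoms fixed during coordinate refinement while everything else refines normally:]

phenix.refine model.pdb data.mtz refine.sites.individual="not element U"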
Hi, I'm not sure if I understand Peter's initial question correctly, but doesn't BUSTER-TNT take into account that one has an incomplete model, and create an additional mask which overlaps to some extent with the bulk solvent mask when it calculates the structure factors? They write this may be helpful for low-resolution data and incomplete structures. Christian
Hi Ed,
For instance, a long time ago I, like many others before and after, discovered that sometimes there are empty cavities inside proteins which get filled with bulk solvent, resulting in negative density blobs. Not a big deal, really, but it was possible to appease the inner nitpicker by generating a mask in CNS, running it through USF's MAMA to remove isolated islands, and supplying it to CNS refinement as an external mask. I haven't seen much change in R-factors (naturally), but the negative density was gone.
do you still have that model and data available so I can quickly reproduce these negative density blobs? I would be happy to do this tonight (instead of having my Friday evening beer). That would be a starting point for me to think about how we can do it better. I have some data from other people, but I would like to have a few more cases in hand before I start doing anything. Thanks! Pavel.
On Fri, 2010-04-02 at 10:53 -0700, Pavel Afonine wrote:
do you still have that model and data available so I can quickly reproduce these negative density blobs?
Not really - this was many years ago. I can unearth the data and model, sure; I just definitely do not remember which model it was or where in the structure the blobs were. Good to know you have other examples on hand, and if I encounter more, I'll send them your way. -- Ed
participants (14)

- Alexandre Urzhumtsev
- Christian Roth
- Christina Bourne
- Ed Pozharski
- Green, Todd
- Jason C Porta
- Maia Cherney
- Nathaniel Echols
- Pavel Afonine
- Peter Grey
- r n
- Ralf W. Grosse-Kunstleve
- Thomas C. Terwilliger
- vincent Chaptal