Dear CCP4 users and experts,

I refined a 3.0 Å dataset with PHENIX and obtained an Rfree of about 0.24, with good geometry according to Ramachandran, beta outliers, etc. Everything seemed to be fine, especially in relation to the resolution. Because the MTZ file I used was quite old and I cannot find my xscale.hkl file, I processed the dataset again (this time with optimizing and polishing) and obtained a "better" file by almost every measure (resolution limit, I/sigma, CC(1/2), Rmeas). I therefore substituted this new MTZ file into my last refinement step (the refinement that led to the above-mentioned Rfree = 0.24). Surprisingly, the refinement starts at Rwork = 0.18 and Rfree = 0.19 but ends up at 0.20 and 0.22, respectively.

Is this usual? I was expecting my data to become slightly better, but what puzzles me is the low starting R-values and the fact that they get worse during refinement. Did I do something wrong? Is it reasonable to replace the MTZ file with a new one in the last refinement step, or should I start the refinement from scratch?

For the sake of completeness: I deleted the header of the PDB file because of the R-flag error that occurs when Phenix recognizes that the PDB file had already been used with a different set of R-free flags.

Sorry, I am still a beginner in this field, so I would be very grateful if somebody could explain this situation and my mistake, and tell me whether I need to start the refinement from the beginning.

Thank you in advance!

Best regards,
Aleksandar

--
-------------------------------------------
Aleksandar Bijelic, MSc.
Institut für Biophysikalische Chemie
Universität Wien
Althanstrasse 14
A-1090 Wien
Tel: +43 1 4277 52536
e-Mail: [email protected]
--------------------------------------------
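P.S. To avoid having to delete the PDB header again, I was thinking of copying the original FreeR flags from the old MTZ into the newly processed one before refining. Below is a rough, untested sketch using the cctbx Python tools that ship with Phenix; the file names (old.mtz, new.mtz), the output name, and the flag-label matching are only placeholders for my actual files, and it assumes both files have compatible symmetry. If there is a more standard way to carry the test set over, I would be glad to hear it.

# Rough, untested sketch: copy the original free-R test set from the old MTZ
# into the reprocessed MTZ so that Rfree stays comparable between refinements.
from iotbx import reflection_file_reader

old_arrays = reflection_file_reader.any_reflection_file("old.mtz").as_miller_arrays()
new_arrays = reflection_file_reader.any_reflection_file("new.mtz").as_miller_arrays()

# pick the free-R flag column from the old file (assumes its label contains "free")
flags = next(a for a in old_arrays if "free" in a.info().label_string().lower())

# pick the merged intensities or amplitudes from the new file
data = next(a for a in new_arrays
            if a.is_xray_intensity_array() or a.is_xray_amplitude_array())

# write a combined MTZ: reprocessed data plus the old test set; reflections
# that are new (e.g. in a higher-resolution shell) carry no flag here and the
# test set would have to be extended afterwards (e.g. by phenix.refine)
root = "IOBS" if data.is_xray_intensity_array() else "FOBS"
mtz_dataset = data.as_mtz_dataset(column_root_label=root)
mtz_dataset.add_miller_array(flags, column_root_label="FreeR_flag")
mtz_dataset.mtz_object().write("new_with_old_flags.mtz")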