Here is how Taubes himself describes his research strategy, in his own words, from GCBC:
"My background is as a journalist with scientific training in college and graduate school. Since 1984, my journalistic endeavors have focused on controversial science and the excruciating difficulties of getting the right answer in any scientific pursuit. More often than not, I have chronicled the misfortunes of researchers who have come upon the wrong answer and found reason, sooner or later, to regret it. I began reporting on public-health and medical issues in the early 1990s, when I realized that the research in these critically important disciplines often failed to live up to the strict standards necessary to establish reliable knowledge. In a series of lengthy articles written for the journal Science, I then developed the approach to the conventional wisdom of public-health recommendations that I applied in this book.
It begins with the obvious question: what is the evidence to support the current beliefs? To answer this question, I find the point in time when the conventional wisdom was still widely considered controversial -- the 1970s, for example, in the case of the dietary-fat/cholesterol hypothesis of heart disease, or the 1930s for the overeating hypothesis of obesity. It is during such periods of controversy that researchers will be most meticulous in documenting the evidence to support their positions. I then obtain the journal articles, books, or conference reports cited in support of the competing propositions to see if they were interpreted critically and without bias. And I obtain the references cited by these earlier authors, working ever backward in time, and always asking the same questions: Did the investigators ignore evidence that might have refuted their preferred hypothesis? Did they pay attention to experimental details that might have thrown their preferred interpretation into doubt? I also search for other evidence in the scientific literature that wasn't included in these discussions but might have shed light on the validity of the competing hypotheses. And, finally, I follow the evidence forward in time from the point at which a consensus was reached to the present, to see whether these competing hypotheses were confirmed or refuted by further research. This process also includes interviews with clinical investigators and public-health authorities, those still active in research and those retired, who might point me to research I might have missed or provide further information and details on experimental methods and interpretation of the evidence.
Throughout this process, I necessarily made judgments about the quality of the research and about the researchers themselves. I tried to do so using what I consider the fundamental requirement of good science: a relentless honesty in describing precisely what was done in any particular work, and a similar honesty in interpreting the results without distorting them to reflect preconceived opinions or personal preferences. "If science is to progress," as the Nobel Prize-winning physicist Richard Feynman wrote forty years ago, "what we need is the ability to experiment, honesty in reporting results -- the results must be reported without somebody saying what they would like the results to have been -- and finally -- an important thing -- the intelligence to interpret the results. An important point about this intelligence is that it should not be sure ahead of time what must be." This was the standard to which I held all relevant research and researchers. I hope that I, too, will be judged by the same standard."
Yes, this is all that I've been doing in my various GCBC Fact Check posts here. Unfortunately, the list of instances I have uncovered (and I have more to share) where you either ignored contrary evidence or flat-out misrepresented the evidence in citations you did include in your book has become staggeringly long.