by Keller MB, Ryan ND, Strober M, Klein RG, Kutcher SP, Birmaher B, Hagino OR, Koplewicz H, Carlson GA, Clarke GN, Emslie GJ, Feinberg D, Geller B, Kusumakar V, Papatheodorou G, Sack WH, Sweeney M, Wagner KD, Weller EB, Winters NC, Oakes R, and McCafferty JP. Journal of the American Academy of Child and Adolescent Psychiatry, 2001, 40[7]:762–772.
Objective: To compare paroxetine with placebo and imipramine with placebo for the treatment of adolescent depression.…Conclusions: Paroxetine is generally well tolerated and effective for major depression in adolescents.
-
PROTOCOL and SAP [Statistical Analysis Plan]: We talked about these documents in the last post – detailed ‘maps’ of how the study is to be conducted and analyzed.
-
CSR: The CLINICAL STUDY REPORT is an elaborate narrative write-up of the study, in this case filled with tables and graphs. It tells the story of the clinical trial in detail. And since this trial had two phases, there are two: the Full Study Report Acute [528 pages] and the Full Study Report Continuation [264 pages]. In this case, the raw data [Appendices] was not initially released.
-
ARTICLE: This is the published article in a journal, abstracted above.
-
IPD: The INDIVIDUAL PARTICIPANT DATA is, in this case, 150 megabytes of raw scores and other tabulations, increasing the mass of available information 100-fold! And bringing most of the trial out of the shadows.
When the RIAT Initiative [Restoring Invisible and Abandoned Trials] was launched in the summer of 2013, Study 329 was a prime candidate, as so much of the data was already available and there was no question that it had been abandoned. Not long after our team had formed, GSK announced that it was establishing a data portal, available to qualified groups who wanted to access the data from a previous trial [after 2007] for some further research project – generally known as Data Sharing. Access was contingent on being accepted by an independent panel of judges. Study 329 was conducted from 1994 until 1998 and published in 2001. We did not want to do a new research project. Instead, we wanted to reanalyze the raw data and potentially republish the study with a new analysis. And, as the figure above shows, we already had access to most of the information anyway. What remained?
When one does a Clinical Trial, there’s a form to fill out for every single interaction [emphasis on every], with the subject’s ID number and the date [the treatment is obviously not recorded in a blinded study]. By the end of things, each subject has amassed literally volumes of forms [the actual number depends on how long they stay around, how many adverse events they report, etc.]. They’re called Case Report Forms [CRFs], and there are plenty of them [50,000+ in Study 329]. The IPD [Individual Participant Data] is created by transcribing the CRFs into a tabularized [and more manageable] format. Why did we want them? We were specifically interested in checking the transcription of the Adverse Events from these raw forms into the tables we already had [a rough sketch of that kind of cross-check follows the CRF definition below]. The CRFs are the data [or at least as close to the real data as one can get].
-
CRF: The CASE REPORT FORMS are all of the forms filled out in the study along the way. They’re the snapshots by the people in direct contact with the subjects – the closest proxy to "being there."
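Just to make that transcription check concrete: below is a minimal sketch [in Python, with entirely hypothetical file names and column labels; nothing here reflects GSK’s actual file formats] of the kind of CRF-versus-IPD comparison described above – lining up the adverse events recorded on the forms against the coded events that ended up in the tables.

```python
import pandas as pd

# Hypothetical extracts, one row per adverse event:
#   crf_events.csv -- terms as hand-transcribed from the Case Report Forms
#   ipd_events.csv -- coded terms as they appear in the Individual Participant Data
crf = pd.read_csv("crf_events.csv")   # columns: subject_id, onset_date, verbatim_term
ipd = pd.read_csv("ipd_events.csv")   # columns: subject_id, onset_date, coded_term

# Line the two sources up by subject and onset date. An outer join keeps the
# events that appear in only one source -- the interesting ones.
merged = crf.merge(ipd, on=["subject_id", "onset_date"], how="outer", indicator=True)

# Events present on a CRF but missing from the IPD tables, and vice versa.
only_on_crf = merged[merged["_merge"] == "left_only"]
only_in_ipd = merged[merged["_merge"] == "right_only"]

# Events present in both sources, but where the verbatim CRF term and the
# coded IPD term disagree [e.g. a suicidal event filed under another heading].
both = merged[merged["_merge"] == "both"]
mismatched = both[both["verbatim_term"].str.lower() != both["coded_term"].str.lower()]

print(f"CRF events missing from the IPD: {len(only_on_crf)}")
print(f"IPD events with no matching CRF entry: {len(only_in_ipd)}")
print(f"Events where the coding changed the term: {len(mismatched)}")
```

In practice the interesting comparison isn’t literal string equality but how an event described on the form ended up being categorized in the published tables; the last check above is only a stand-in for that judgment.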
Yes, but when does the study of Study 329 change the minds of our prominent scientists?
http://www.lawyersandsettlements.com/articles/ssri/paxil-suicide-risk2-01961.html?utm_expid=3607522-8.uTwqV-N-RqmmAyO2kCf6lA.0&utm_referrer=https%3A%2F%2Fwww.google.com%2F#.VfEHB7SbIR
http://www.finance.senate.gov/:
A few links. Food for thought. How many replications will the Restored Study Report require before the tipping point is achieved?
Or is it just a matter of who does the reporting and who reads the report? Consider what Mina Dulcan (editor of JAACAP at the time Study 329 was published) so diligently tried to teach Shelly Jofre (investigative journalist, Panorama/BBC) during an interview (the transcript can be found at study329.org; see Background). A few key statements by Ms. Dulcan:
“Science isn’t that simple”
“Well, we would love to have pure science, but you know what? There is no such thing as it turns out.”
“Well, again, science is complicated.”
Here’s how JAACAP editor Dr. Mina Dulcan answers Shelly’s last question: “So you don’t have any regrets at all?” (re: publishing, then not retracting, Study 329)
MD: “I don’t, no, because it does what science does, which is, it puts something out there, people ask questions, more analysis is there, the Regulators look at all the data, they open things up, that’s how science works. The purpose of a scientific journal is not to tell people what to do.”
** I find it interesting that when a prominent scientist, doctor, or editor responds to a lay person’s astute criticisms, the point driven home is that the science is beyond the lay person’s comprehension. But when that lay person presents a strong case, citing the negative consequences of scientists misinforming the doctors who then unknowingly put their patients in harm’s way, simply because those doctors trusted the source of the information and all the rigors of the science that lay people supposedly cannot comprehend, there is a subtle change in the language. The professional journal that can only be understood by the scientific elite, and by the clinicians who are its target audience, “does not tell *people* what to do”?? I wonder why the prestigious editor of the world’s leading professional journal for child/adolescent psychiatric *clinicians* couldn’t say *doctors/psychiatrists/professionals in the field of children’s mental health*??
Or, in the words of the scientific elite defending the publication of Study 329, maybe all the professional clinicians are just people – like you and me??
Mickey, I am very eager to see your article! There must be a book-worthy backstory to the efforts that went into it. I appreciate your description of the case report forms, their transformation into individual participant data, and that you were eventually able to access both. Does that mean you had opportunity to (a) accurately classify adverse events such as suicidal ideation/suicide attempts, and (b) directly compare your accurate classification to the adverse events data reported in the Keller et al. published version of study 329? Among the many reasons study 329 is notorious is that, as I recall, the authors “massaged” the adverse event coding to hide suicidal events associated with paroxetine. It would be an important contribution if you were able to provide direct evidence of such fraudulent reporting.
All of the brave number-crunchers deserve medals of recognition. Thank you, Mickey.