It is striking how press-conference-quality science often fails to meet the standards of undergraduate-quality science. Examples include Ida, arsenic DNA, and the claim that vaccines cause autism; I could go on, but you get the idea. All of these cases have several things in common, many things wrong, and interesting lessons. They matter for many reasons, but mostly because, even from bad science, we can learn.
Ida, that delightful little Darwinius masillae fossil from the Messel pit in the southwest corner of the German state of Hesse, was a gorgeous fossil, and could have been even more remarkable had she been properly studied, instead of being shoehorned into the role of a transitional form between Strepsirrhini and Haplorhini by comparing her only to living primates. You cannot use paleontology to prove a point while simultaneously ignoring all the other paleontological finds.
The autism-vaccine crap relied on something else entirely, several somethings to be more precise. It drew sweeping conclusions from a small, uncontrolled study (12 data points). Data were manipulated to show a stronger correlation. Some of the children had medically documented problems BEFORE they received the vaccines, while others showed no problems even after being vaccinated (except in the paper). So these data were thoroughly cooked and manipulated to fit the conclusions Wakefield (and/or the lawyers who had been paying him) wanted.
The arsenic DNA debacle is the most recent of these cases of highly publicized bad science. If you look closely enough, though, you can find a number of similarities between the “science” of all of them.
1) There is always a press conference in really bad science: not "MHC affects mate selection" bad (which was pretty bad, but seems to have been an honest mistake), but "not even wrong" bad, or bad in the sense of ignoring or misrepresenting previous research.
2) The paper is always released at the same time as said press conference, leaving no time for review.
3) The data, methods, and conflicts of interest are not fully disclosed.
There are a few other things that these cases have in common:
1) Discrepancies between the press conference and paper
2) Overly simple explanations of the paper during the press conference
3) Wild claims at the press conference about the paper's scientific impact.
If these high-profile press conferences continue to be facepalm-inducing botches, they will continue to diminish the public perception of science. So, speaking as someone educated but (unfortunately, and not by choice) not active in the scientific community: go where the research leads, not where you want the data to lead.