A post over at Uncertain Principles got me thinking about "library" research. What I mean by library research is really literature research, but it's called library research in the post, so I figured I'd keep the same nomenclature. "Library" is in quotes because somehow that seems to convey the sentiment that, in doing literature research, one does everything possible to avoid having to actually, physically, go to the library. And when one finds that the physical library is the only way to access certain information, one often starts to seriously reconsider if that information is all that essential. (Even worse if one finds the information must be ordered from another library, thus creating a delay in information access, in addition to having to physically go to the library.) But I digress.
What I really wanted to say about library research is that I don't understand why I was never taught how to do it correctly. I mean, how hard is it to explain that there are these things called "peer-reviewed journals" and they contain the acceptable articles to reference? And even more explicitly, when opening yourself up to a new research topic, there are things called "review articles" that are usually the best way to get started. There you go, library research, taught in two sentences.
But wait, there are some potential problems with that simplification. Firstly, there are books, which are usually not formally peer-reviewed. And I suspect books play a much more important role in humanities and social science research than in STEM research. Secondly, how does one find these peer-reviewed journals and distinguish them from the rest of the vast internet of information? Is there a generally applicable solution to these problems?
For books, the answer seems to be: if it's accessible in the library, it's probably legit enough to reference. Obviously, that answer is more for undergrads doing class projects than for researchers looking for answers. When you're really looking for answers (rather than just trying to make an A), it seems you intuitively acquire a more refined b-s detector. Ideally, you use the information that appears legitimate and useful, you reject information only if you have a legitimate reason to do so, and you file away questionable information for later. Non-ideally, you just believe what supports your thesis and reject what does not. Or cite the crap and cite what's wrong with it. Whichever.
As for how you find and identify peer-reviewed journals? Well, I do my literature searches in PubMed and on ISI Web of Knowledge/Science, and that works for me. And I look up articles cited by other articles in peer-reviewed journals. And that keeps me on track. But what of other fields? I've heard Google Scholar is nice, but I've never used it myself. Are there databases for humanities and social science? I also wonder whether peer-reviewed journals state somewhere, on their websites or in their hard copies, that they are peer reviewed. Maybe the best bet is to ask someone in the general field how they do their literature searches.
At least, that's how I learned about this stuff. I learned about the databases I use in graduate school, simply as offhand information offered in conversation with profs and other students. That's also how I learned about the distinction between peer-reviewed articles and the general internet-information riff-raff. And when I typed some keywords from my thesis topic into those databases and was overwhelmed with 1000+ articles, I asked my advisor: how the heck do I get a handle on all this literature? And he told me the common sense thing, which was to start with the most recent review article and work backwards into the interesting references. But why didn't I learn any of this stuff earlier?
I thought part of the answer to why I didn't learn good literature research techniques earlier was that the internet was still debuting when I was an undergrad (1998-2002). But it turns out PubMed has been around since 1996. ISI Web of Knowledge did launch in 2002, so at least my excuse is valid there. (What did non-bio-x scientists do before ISI?) So in reality, online access to journals was still developing when I was an undergrad, and the "cutting edge" research technique taught in my 1998 college orientation class was how to use the computer-based card catalog at my college library. I guess the internet really changed things.
Another way the internet changed things? Pre-internet, it was a lot harder to even find information in any source other than peer-reviewed journals and library-accessible books. So, I guess I can't place too much blame on my college educators for why I didn't learn to do "library" research the way it should be done in 2010.
The moral of the story? If you're trying to do legitimate research in a field, ask someone in the field how they get started. In science, ISI Web of Knowledge and PubMed (for bio-anything) are the best starting points. Start with the most recent review articles if you can find them, and work backwards into the references. Once you have a handle on the subject, you can generate more specific search terms and can scan other titles to find interesting articles. And that's how "library" research is done. At least in 2010.
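If you like to script things, here's a rough sketch of that "recent reviews first, then work backwards" workflow against NCBI's public E-utilities interface to PubMed. It's just my own illustration, not anything official: the topic string and helper names are made up, and the reference lookup only works for articles whose reference lists NCBI has indexed.

```python
# A minimal sketch (my own, not from the post) of the "recent reviews first,
# then work backwards" approach, using NCBI's public E-utilities API for
# PubMed. The topic string, result counts, and helper names are placeholders;
# error handling is omitted for brevity.
import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def fetch_json(endpoint, **params):
    """Call an E-utilities endpoint and return the parsed JSON response."""
    params["retmode"] = "json"
    url = f"{EUTILS}/{endpoint}.fcgi?{urllib.parse.urlencode(params)}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def recent_reviews(topic, max_results=5):
    """PubMed IDs of review articles matching `topic`."""
    result = fetch_json("esearch", db="pubmed", retmax=max_results,
                        term=f"{topic} AND review[Publication Type]")
    return result["esearchresult"]["idlist"]

def references_of(pmid):
    """PubMed IDs cited by `pmid`, when NCBI has the reference list."""
    result = fetch_json("elink", dbfrom="pubmed", db="pubmed", id=pmid,
                        linkname="pubmed_pubmed_refs")
    linksetdbs = result["linksets"][0].get("linksetdbs", [])
    return linksetdbs[0]["links"] if linksetdbs else []

if __name__ == "__main__":
    # Hypothetical topic; substitute your own thesis keywords.
    for pmid in recent_reviews("optical trapping of living cells"):
        print(f"Review: https://pubmed.ncbi.nlm.nih.gov/{pmid}/")
        for ref in references_of(pmid)[:10]:
            print(f"    cites: https://pubmed.ncbi.nlm.nih.gov/{ref}/")
```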
Thursday, July 29, 2010
Tuesday, July 6, 2010
Fair's fair?
I just followed a trail of bread crumbs about science and graduate school, beginning with Ph.D. training as a Ponzi scheme, meandering through The Real Science Gap, what's socially wrong with science, Something Deeply Wrong with Chemistry, something's wrong with this lab and it's not atypical, and finally jumping off that miserable, depressing trail to end with the upbeat (and totally naive) Drawings of Scientists by 7th graders. By the end, when I read what the 7th graders wrote about scientists being normal, happy people who lead normal, happy lives, I really wanted to cry (and also tell those poor 7th graders, "Don't believe it!").
Sigh...
What does it mean?
First, it's depressing.
But, then, I also note in the comments a few statements like "Sounds like the video game industry," or "Sounds like computer programming."
Is this just the way it is? The way it has to be? Is this as fair as it's ever going to get? What's fair? And for that matter, what's best?
Is it fair (or best) that young people (20-40 years old) work very hard for low pay in order to keep technology chugging along at a reasonable pace? Our hard work and low pay makes far more discovery and productivity possible than if we worked fewer hours and demanded higher pay. Right?
Is it fair (or best) that only a small percentage of us can make it to slightly cushier jobs, with some amount of financial payoff, job security and benefits? Surely that's the only way to ensure that only the cream-of-the-crop rise to the top and make the big, high-level decisions about what lines of research to pursue and how to spend the precious research dollars. Right?
And what of the remaining bright, inquisitive, scientific minds that refuse to live that life for long enough to rise to the top, or get a little unlucky somewhere along the way, or just don't make it into the top few percent? Is it right (or best) that they end up un- or under-employed, or working their entire lives like the 20-40 yr old "youngsters" with low pay, no security, few benefits, and ridiculous hours?
Or is it possible that a better way could be found? Could there be a system where bright scientific minds could find financial reward, job security, benefits, and reasonable hours as well as intellectually stimulating, rewarding jobs that also better maximize their contributions, and maybe find these jobs at an age as early as, say, 22? Is that what you get when you skip the Ph.D. and just join industry right out of college? Or do you get more of the same - high expectations for low pay and some distant possibility of moving up to financial reward, security, and the possibility of intellectual contribution?