Very cool, and very relevant to my life -- I am currently writing a meta-analysis and finishing my literature search.
I gave it a version of my question, it asked me reasonable follow-ups, and we refined the search to:
I want to find randomized controlled trials published by December 2023, investigating interventions to reduce consumption of meat and animal products with control groups receiving no treatment, measuring direct consumption (self-reported outcomes are acceptable), with at least 25 subjects in treatment and control groups (or at least 10 clusters for cluster-assigned studies), and with outcomes measured at least one day after treatment begins.
I just got the results back: https://www.undermind.ai/query_app/display_one_search/e5d964....
It certainly didn't find everything in my dataset, but:
* The first result is in the dataset.
* The second one is a study I excluded for something buried deep in the text.
* The third is in our dataset.
* The fourth is excluded for something the machine should have caught (32 subjects in total), though perhaps I needed to clarify that I meant at least 25 subjects in the treatment group and 25 in the control group.
* The fifth result is a protocol for the study in result 3, so a more sophisticated search would have identified that these were related.
* The sixth study was entirely new to me, and though it didn't qualify because of the way the control group received some aspect of treatment, it's still something that my existing search processes missed, so right away I see real value.
So, overall, I am impressed, and I can easily imagine my lab paying for this. It would have to advance substantially before it was my only search method for a meta-analysis -- it seems to have missed a lot of the gray literature, particularly those studies published on animal advocacy websites -- but that's a much higher bar than I need for it to be part of my research toolkit.
For a systematic review/meta-analysis you’d be expected to document your search strategy, exclusion criteria, etc. anyway, wouldn’t you? That’d preclude using a tool like this other than as a sense check to see if you needed to add more keywords or expand your search criteria.
My wife does that for her day job (in the U.K. national healthcare system) and the systematic reviews have to be super well documented and even pre-registered on a system called PROSPERO. The published papers always have the full search strategy at the end.
I was planning to say "I used an AI search tool" and cite undermind.ai in my methods section. I think that won't raise any eyebrows in the review process but we'll see.
Have a look at the PRISMA reporting guidelines.
For a meta-analysis, you might want to try the "extend" feature. It sends the agent out to gather more papers (we only analyze 100 carefully in the initial pass), so if your report says something like "only 55% discovered", it could be useful.
(Also, if you want, you can share your report URL here, others will be able to take a look.)
Thanks, I added my URL
I only have experience writing normal papers, so just out of interest, could you elaborate on what your usual search routine for a meta-analysis is?
There's a whole established process for this, see here for a textbook chapter https://training.cochrane.org/handbook/current/chapter-04
However, because I'm writing a methods-focused review -- we only look at RCTs meeting certain (pretty minimal) criteria relating to statistical power and measurement validity -- what I'm doing is closer to a combination of a review of previous reviews (there have been dozens in my field) and a snowball search (searching the bibliographies of relevant papers). I also consulted experts in the field. Finding bachelor's theses has been challenging, however; many are actually relevant, so Undermind was helpful there.