I'm interested in building tools to improve this, so I'd really appreciate hearing what you all think are the biggest challenges in this area. And if there are existing tools that you consider "secret weapons" for doing better at science, I'd love to hear about those as well.
From there you can recursively search through the bibliography of the seminal works and the works that cite the seminal work to build a research map.
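That recursive walk through bibliographies is essentially a breadth-first traversal of the citation graph. Here's a minimal sketch in Python; the citation lookup is stubbed with an in-memory dict (in practice you'd swap in a real citation API, e.g. Semantic Scholar's Graph API), and all paper IDs below are made up for illustration:

```python
from collections import deque

# Hypothetical citation data: paper ID -> list of referenced paper IDs.
# In a real tool this lookup would hit a citation API; it's stubbed
# here so the traversal logic is self-contained.
CITATIONS = {
    "seminal": ["a", "b"],
    "a": ["c"],
    "b": ["c", "d"],
    "c": [],
    "d": [],
}

def build_research_map(seed: str, max_depth: int = 2) -> dict[str, int]:
    """Breadth-first walk of the bibliography starting from a seed paper.

    Returns each reachable paper mapped to its distance (in citation
    hops) from the seed -- a rough "research map".
    """
    seen = {seed: 0}
    queue = deque([seed])
    while queue:
        paper = queue.popleft()
        depth = seen[paper]
        if depth >= max_depth:
            continue  # don't expand beyond the depth limit
        for ref in CITATIONS.get(paper, []):
            if ref not in seen:
                seen[ref] = depth + 1
                queue.append(ref)
    return seen
```

The same traversal works in the other direction (papers citing the seminal work) if you swap the lookup for a "cited by" query; the depth limit keeps the map from exploding, since citation counts grow fast with each hop.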
When researching a new field you often end up identifying a) who the top researchers are, so you can go read all of their work, and b) who is working on the topic in $current_year, whom you might want to contact and talk to.
For example, when it comes to internal combustion engine research, Heywood is the man: https://scholar.google.co.uk/scholar?hl=en&as_sdt=0%2C5&q=JB...
(Sadly, most cutting-edge research is locked away inside the automotive companies.)
Or in computational fluid dynamics the 'entry point' to the field is basically JD Anderson.
In both cases you're like 6 degrees of separation away from the cutting edge in several micro topics of active research.
> synthesize them into a coherent mental model to inform your own research
For the mental model there's no real way around sitting down and reading a bunch of papers. I basically taught myself how to read papers efficiently and then read papers every day (often dead ends, which can be quickly discarded).
Yes, this would be great. Would also be great if there were a way to trace back to the "origin" papers that later papers build upon, and, when you don't understand a term or concept, to search for papers that explain it better.
1) A better (cheaper) way to access them. It doesn't necessarily have to be free as in SciHub, but there's no way I'm going to pay $80 as an individual to read one paper.
2) An easy way to summarize them, ask questions of them, etc. Google's NotebookLM (https://notebooklm.google.com/) is actually decent at this... upload a PDF and you can ask it questions about that content with minimal hallucination and citations back to the source. However, it's buggy (some files just never finish loading, others won't accept any prompt at all). And it's probably another short-lived experiment soon to meet the Google Graveyard :(
I would be willing to pay maybe $10-$20/mo for a service that can do both (provide Netflix-like access to papers, and also use LLM to summarize them and answer questions). Bonus points if it can do its own meta-analysis of multiple related papers and easily summarize them.
I suspect journal publishers would be heavily resistant to any of that. A more technical workaround would probably be a browser extension that uses public/school library logins to fetch papers on the client side and then mirror them into the service. There's something like this in the legal world, https://free.law/recap, which bypasses access fees. But there are no copyright concerns there, since the documents themselves are public domain works of the federal government, unlike scientific papers.