Abstract

Large language models (LLMs) can answer general scientific questions, yet they are constrained by their pretraining cut-off dates and lack the ability to provide specific, cited scientific knowledge. Here, we introduce Network for Knowledge Organization (NEKO), a workflow that uses the LLM Qwen to extract knowledge through scientific literature text mining. When a user inputs a keyword of interest, NEKO can generate knowledge graphs that link bioinformation entities and produce comprehensive summaries from PubMed searches. NEKO significantly enhances LLM capability and has immediate applications in daily academic tasks such as education of young scientists, literature review, paper writing, experiment planning/troubleshooting, and new idea/hypothesis generation. We exemplified this workflow's applicability through several case studies on yeast fermentation and cyanobacterial biorefinery. NEKO's output is more informative, specific, and actionable than GPT-4's zero-shot Q&A. NEKO offers flexible, lightweight local deployment options. NEKO democratizes artificial intelligence (AI) tools, making scientific foundation models more accessible to researchers without excessive computational power.
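A minimal sketch of what such a keyword-driven workflow could look like in practice, assuming Biopython's Entrez interface for PubMed retrieval and networkx for the graph; the extract_entities() function is a hypothetical placeholder for the LLM extraction step, not NEKO's actual implementation:

```python
# Sketch of a keyword -> PubMed -> knowledge-graph workflow (illustrative only).
# Assumptions: Biopython (Entrez) for PubMed access, networkx for the graph,
# and a placeholder extract_entities() standing in for the LLM extraction step.
from Bio import Entrez
import networkx as nx

Entrez.email = "you@example.org"  # NCBI requires a contact email

def fetch_abstracts(keyword, max_results=20):
    """Search PubMed for a keyword and return abstract texts."""
    handle = Entrez.esearch(db="pubmed", term=keyword, retmax=max_results)
    ids = Entrez.read(handle)["IdList"]
    handle = Entrez.efetch(db="pubmed", id=",".join(ids),
                           rettype="abstract", retmode="text")
    return handle.read().split("\n\n")

def extract_entities(text):
    """Hypothetical stand-in for the LLM-based extraction of
    (entity, relation, entity) triples; a real workflow would prompt a
    local model such as Qwen here."""
    return []  # e.g. [("Synechocystis", "produces", "sucrose")]

def build_knowledge_graph(keyword):
    """Link extracted entities into a directed graph keyed by relation."""
    g = nx.DiGraph()
    for abstract in fetch_abstracts(keyword):
        for head, relation, tail in extract_entities(abstract):
            g.add_edge(head, tail, relation=relation)
    return g

graph = build_knowledge_graph("cyanobacteria biorefinery")
print(graph.number_of_nodes(), "entities,", graph.number_of_edges(), "relations")
```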

Original language: English
Pages (from-to): 60-67
Number of pages: 8
Journal: Metabolic Engineering
Volume: 87
DOIs
State: Published - Jan 2025

Keywords

  • Foundation model
  • Knowledge graph
  • Large language model
  • Qwen
  • Retrieval augmented generation
