
web-research

by @langchain-ai · v1.0.0

Create and save research plans, organizing all research files into a dedicated folder.

Tags: Web Scraping · Data Collection · Information Retrieval · OSINT · Data Analysis · GitHub

Installation
npx skills add langchain-ai/deepagents --skill web-research

Before / After Comparison

Before

Without a clear plan and organization, web research often leads to information overload, repeated searches, and difficulty filtering useful information, resulting in low-quality, time-consuming results.

After

With the web-research skill, you can systematically create a research plan, organize all research files in a dedicated folder, and decompose the research question into subtopics, improving research efficiency and producing high-quality, well-organized results.

SKILL.md


Web Research Skill

Research Process

Step 1: Create and Save Research Plan

Before delegating to subagents, you MUST:

Create a research folder - Organize all research files in a dedicated folder relative to the current working directory:

mkdir research_[topic_name]

This keeps files organized and prevents clutter in the working directory.

Analyze the research question - Break it down into distinct, non-overlapping subtopics

Write a research plan file - Use the write_file tool to create research_[topic_name]/research_plan.md containing:

  • The main research question

  • 2-5 specific subtopics to investigate

  • Expected information from each subtopic

  • How results will be synthesized

Planning Guidelines:

  • Simple fact-finding: 1-2 subtopics

  • Comparative analysis: 1 subtopic per comparison element (max 3)

  • Complex investigations: 3-5 subtopics
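Step 1 can be sketched in plain shell. This is an illustration only: the topic name `ai_agents`, the question, and the subtopics are hypothetical, and the heredoc stands in for the skill's write_file tool.

```shell
# Hypothetical topic name; a real run would use write_file instead of a heredoc.
mkdir -p research_ai_agents

cat > research_ai_agents/research_plan.md <<'EOF'
# Research Plan: AI Agents

## Main Question
How are AI agent frameworks used in production?

## Subtopics (2-5, non-overlapping)
1. Framework adoption - expected: usage statistics, named companies
2. Common failure modes - expected: documented incidents, mitigations

## Synthesis
Combine adoption data with failure modes into a risk/benefit summary.
EOF
```

Using `mkdir -p` keeps the command idempotent if the folder already exists.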

Step 2: Delegate to Research Subagents

For each subtopic in your plan:

Use the task tool to spawn a research subagent with:

  • Clear, specific research question (no acronyms)

  • Instructions to write findings to a file: research_[topic_name]/findings_[subtopic].md

  • Budget: 3-5 web searches maximum

Run up to 3 subagents in parallel for efficient research

Subagent Instructions Template:

Research [SPECIFIC TOPIC]. Use the web_search tool to gather information.
After completing your research, use write_file to save your findings to research_[topic_name]/findings_[subtopic].md.
Include key facts, relevant quotes, and source URLs.
Use 3-5 web searches maximum.
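The task tool is agent-internal, but the parallel fan-out pattern above can be mimicked with backgrounded shell jobs, each writing its own findings file. The topic and subtopic names below are hypothetical, and the printf stands in for a subagent's 3-5 web searches.

```shell
# Illustration of the parallel-subagent pattern: three jobs, each writing
# findings_[subtopic].md, run concurrently and then joined with wait.
mkdir -p research_ai_agents   # hypothetical topic folder
for sub in adoption failures tooling; do
  (
    # A real subagent would perform web searches here before writing.
    printf '# Findings: %s\n\n- key fact\n- source URL\n' "$sub" \
      > "research_ai_agents/findings_${sub}.md"
  ) &
done
wait   # block until all three "subagents" finish
```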

Step 3: Synthesize Findings

After all subagents complete:

Review the findings files that were saved locally:

  • First run list_files research_[topic_name] to see what files were created

  • Then use read_file with the file paths (e.g., research_[topic_name]/findings_*.md)

  • Important: Use read_file for LOCAL files only, not URLs

Synthesize the information - Create a comprehensive response that:

  • Directly answers the original question

  • Integrates insights from all subtopics

  • Cites specific sources with URLs (from the findings files)

  • Identifies any gaps or limitations

Write final report (optional) - Use write_file to create research_[topic_name]/research_report.md if requested

Note: If you need to fetch additional information from URLs, use the fetch_url tool, not read_file.
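The review phase of Step 3 maps onto ordinary file commands. In this sketch the folder name is hypothetical, and the printf seeds a sample findings file so the example runs on its own; `ls` plays the role of list_files and `cat` the role of read_file.

```shell
# Seed a stand-in findings file so the review commands have input.
mkdir -p research_ai_agents
printf '# Findings: adoption\n\n- key fact (source URL)\n' \
  > research_ai_agents/findings_adoption.md

ls research_ai_agents/                  # list_files: see what the subagents wrote
cat research_ai_agents/findings_*.md    # read_file: read every findings file
```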

Best Practices

  • Plan before delegating - Always write research_plan.md first

  • Clear subtopics - Ensure each subagent has distinct, non-overlapping scope

  • File-based communication - Have subagents save findings to files, not return them directly

  • Systematic synthesis - Read all findings files before creating final response

  • Stop appropriately - Don't over-research; 3-5 searches per subtopic is usually sufficient

Weekly Installs: 881
Repository: langchain-ai/deepagents
GitHub Stars: 14.3K
First Seen: Jan 22, 2026
Security Audits: Gen Agent Trust Hub: Pass · Socket: Pass · Snyk: Warn
Installed on: opencode 792 · codex 769 · gemini-cli 764 · github-copilot 719 · cursor 711 · kimi-cli 664


Statistics

Installs: 0
Rating: 0.0 / 5.0
Version: 1.0.0
Updated: March 17, 2026
Before/After comparisons: 1


Compatible Platforms

  • Claude Code
  • OpenClaw
  • OpenCode
  • Codex
  • Gemini CLI
  • GitHub Copilot
  • Amp
  • Kimi CLI

Timeline

Created: March 17, 2026
Last Updated: March 17, 2026