
docs(readme): expand MCP server setup instructions #170

Open

s4steve wants to merge 5 commits into teng-lin:main from s4steve:main

Conversation


@s4steve s4steve commented Mar 9, 2026

Summary

  • Add authentication prerequisite (must run notebooklm login before starting the server)
  • Add API key security docs (--api-key flag and NOTEBOOKLM_MCP_API_KEY env var)
  • Add host binding guidance (default 127.0.0.1 vs network-exposed 0.0.0.0)
  • Add Claude Desktop / settings.json JSON config example alongside the CLI command
  • Add step-by-step Docker setup (login first, clone, .env for API key)
  • Add environment variable reference table (NOTEBOOKLM_MCP_API_KEY, NOTEBOOKLM_DOWNLOAD_DIR)
  • Fix tool count: 27 → 26 (matches actual registered tools)

Test plan

  • Verify all code blocks are syntactically correct
  • Verify Docker steps work end-to-end (login → compose up)
  • Verify claude mcp add command with --header flag works with an API key

🤖 Generated with Claude Code

s4steve and others added 5 commits March 8, 2026 14:28
Adds a full MCP (Model Context Protocol) SSE server exposing 27 tools
covering notebooks, sources, chat, artifacts, and notes APIs.

- `src/notebooklm/mcp_server.py`: MCP server implementation
- `pyproject.toml`: add `mcp` optional dep group + `notebooklm-mcp` entry point
- `Dockerfile`, `docker-compose.yml`, `.dockerignore`: containerization support

Install with: `pip install 'notebooklm-py[mcp]'`
Run with: `notebooklm-mcp --host 0.0.0.0 --port 8765`

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…s mode

- Remove notebooklm-py.png (not needed in repo)
- Replace updated_at with is_owner in notebook dict serialization
- Add stateless=True to MCP server run() for proper SSE behavior

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
- Update "Three Ways to Use" to "Four Ways to Use" with MCP Server entry
- Add MCP Server quick start section with CLI and Docker usage
- Add mcp and all install options to Installation section

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…cker

- [Critical] Add optional API key auth middleware (--api-key /
  NOTEBOOKLM_MCP_API_KEY); returns HTTP 401 without valid Bearer token;
  warns at startup if unconfigured
- [High] Change default --host from 0.0.0.0 to 127.0.0.1; Dockerfile
  CMD retains explicit 0.0.0.0 for container use
- [High] Fix _wait_and_download_media bug: file_path now only set on
  successful download via dispatch dict, not unconditionally
- [Medium] Add _sanitize_text_input() helper; apply to all free-text
  inputs (instructions, custom_prompt, title, question, content) to
  prevent prompt injection
- [Medium] Add ensure_ascii=False to all json.dumps() calls
- [Medium] Refactor _dispatch_tool into per-tool _handle_* functions
  with a dispatch dict
- [Medium] Move import json, ReportFormat, ArtifactType to top-level
- [Medium] Dockerfile: replace COPY . . with selective COPY; add
  non-root appuser before CMD
- [Medium] .dockerignore: exclude storage_state.json and .notebooklm/

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Add authentication prerequisite, API key security, host binding
guidance, Claude Desktop JSON config, Docker setup steps, and
environment variable reference table. Fix tool count (26, not 27).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@gemini-code-assist

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly expands the integration capabilities of NotebookLM by introducing a Model Context Protocol (MCP) server. This new server allows any MCP-compatible client, such as Claude Desktop or Claude Code, to seamlessly interact with NotebookLM's functionalities. The changes include detailed setup instructions, enhanced security features like API key authentication, and full Docker support, making it easier for developers to deploy and utilize NotebookLM in various environments.

Highlights

  • MCP Server Introduction: Introduced a Model Context Protocol (MCP) server to expose NotebookLM capabilities, allowing integration with external MCP-compatible clients like Claude Desktop or Claude Code.
  • Comprehensive Documentation: Provided extensive documentation in README.md covering MCP server setup, authentication prerequisites, API key security, host binding guidance, and client connection examples.
  • Docker Support: Added full Docker support with new Dockerfile and docker-compose.yml files for easy containerized deployment of the MCP server, including volume mounts for authentication and downloads.
  • New Dependencies and Entry Point: Updated pyproject.toml to include mcp[sse], uvicorn, and starlette as optional dependencies, and registered notebooklm-mcp as a new console script.
  • MCP Tool Implementation: Implemented 26 distinct MCP tools within src/notebooklm/mcp_server.py for managing NotebookLM notebooks, sources, chat interactions, artifact generation, and notes.
  • Tool Count Correction: Corrected the reported number of exposed MCP tools in the documentation from 27 to 26 to match the actual implemented tools.
Changelog
  • .dockerignore
    • Added common Python and Git ignore patterns, along with NotebookLM-specific files, to exclude them from the Docker build context.
  • Dockerfile
    • Added a Dockerfile to create a container image for the NotebookLM MCP server, specifying Python version, working directory, dependencies, and default command.
  • README.md
    • Updated the "Ways to Use" section to include the MCP Server.
    • Added new installation options for notebooklm-py[mcp].
    • Provided extensive documentation for MCP server setup, client connection, Docker deployment, and configuration.
    • Corrected the reported number of exposed MCP tools from 27 to 26.
  • docker-compose.yml
    • Added a Docker Compose configuration to define and run the notebooklm-mcp service, including port mapping and volume mounts for authentication and downloads.
  • pyproject.toml
    • Updated project dependencies by adding an mcp optional dependency group with mcp[sse], uvicorn, and starlette.
    • Registered notebooklm-mcp as a new console script entry point.
  • src/notebooklm/mcp_server.py
    • Added a new Python module that implements the core logic for the NotebookLM MCP server.
    • Defined and dispatched 26 tools for various NotebookLM operations, including notebook, source, chat, artifact, and note management.
    • Implemented client initialization, API key authentication middleware, and integration with Starlette and Uvicorn for the SSE transport.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces an MCP server with Docker support, exposing library functionality as tools for MCP-compatible clients. A critical security vulnerability has been identified: the server runs without mandatory authentication by default and binds to all network interfaces in the Docker configuration. This could allow unauthenticated attackers on the same network to gain full access to the user's NotebookLM account and sensitive data. Additionally, there's a critical bug in the generate_mind_map tool handler and two medium-severity bugs in media and report download helpers that could lead to incorrect artifact downloads.

Comment on lines +783 to +787

```python
status = await client.artifacts.generate_mind_map(args["notebook_id"])
await client.artifacts.wait_for_completion(args["notebook_id"], status.task_id)
artifact = await client.artifacts.get(args["notebook_id"], status.task_id)
return json.dumps(
    _artifact_to_dict(artifact) if artifact else {"task_id": status.task_id},
```

critical

The generate_mind_map method in the client does not return a GenerationStatus object with a task_id. Instead, it directly returns a dictionary containing the mind map data and the note_id of the created mind map artifact. The current implementation incorrectly tries to access status.task_id, which will cause a runtime error. The logic should be updated to use the note_id from the result to fetch the artifact. The call to wait_for_completion is also unnecessary and incorrect for mind map generation.

Suggested change

```diff
-status = await client.artifacts.generate_mind_map(args["notebook_id"])
-await client.artifacts.wait_for_completion(args["notebook_id"], status.task_id)
-artifact = await client.artifacts.get(args["notebook_id"], status.task_id)
-return json.dumps(
-    _artifact_to_dict(artifact) if artifact else {"task_id": status.task_id},
+result = await client.artifacts.generate_mind_map(args["notebook_id"])
+note_id = result.get("note_id")
+if not note_id:
+    raise RuntimeError("Failed to generate mind map: no note_id returned.")
+artifact = await client.artifacts.get(args["notebook_id"], note_id)
+return json.dumps(
+    _artifact_to_dict(artifact) if artifact else {"task_id": note_id},
```

Comment on lines +886 to +890

```python
if not api_key:
    logger.warning(
        "No API key configured. Anyone with network access can use this server. "
        "Set --api-key or NOTEBOOKLM_MCP_API_KEY to restrict access."
    )
```

security-high

The MCP server does not require an API key by default, only issuing a warning if one is missing. When deployed via the provided Docker configuration (which binds to 0.0.0.0), this exposes the user's Google session cookies (loaded from storage_state.json) to anyone on the network. An unauthenticated attacker could gain full access to the user's NotebookLM account, including reading and deleting notebooks. Authentication should be mandatory when the server is bound to a non-loopback interface.

Suggested change

```diff
-if not api_key:
-    logger.warning(
-        "No API key configured. Anyone with network access can use this server. "
-        "Set --api-key or NOTEBOOKLM_MCP_API_KEY to restrict access."
-    )
+if host != "127.0.0.1" and not api_key:
+    raise ValueError(
+        "API key is required when binding to a non-loopback interface. "
+        "Set --api-key or NOTEBOOKLM_MCP_API_KEY."
+    )
+if not api_key:
+    logger.warning(
+        "No API key configured. Only loopback access is safe. "
+        "Set --api-key or NOTEBOOKLM_MCP_API_KEY to restrict access."
+    )
```


```dockerfile
EXPOSE 8765

CMD ["notebooklm-mcp", "--host", "0.0.0.0", "--port", "8765"]
```

security-medium

Binding to 0.0.0.0 by default in the Docker container exposes the application to the network. In this application, which handles sensitive Google session cookies and has optional authentication, this default configuration is insecure. It is safer to bind to 127.0.0.1 by default or ensure the application enforces authentication when exposed.

```python
download_func = download_methods.get(kind)
if download_func:
    try:
        await download_func(notebook_id, str(output_path))
```

medium

The download functions for media artifacts (download_audio, download_video, etc.) are called without specifying the artifact_id. When multiple artifacts of the same type exist, this could lead to downloading the wrong file (the latest one by default) instead of the one that was just generated. The artifact_id of the newly completed artifact is available and should be passed to ensure the correct file is downloaded.

Suggested change

```diff
-        await download_func(notebook_id, str(output_path))
+        await download_func(notebook_id, str(output_path), artifact_id=artifact_id)
```

```python
    result["file_path"] = str(out_path)
elif artifact.kind == ArtifactType.REPORT:
    out_path = download_dir / f"report_{artifact_id}.md"
    await client.artifacts.download_report(notebook_id, str(out_path))
```

medium

The download_report function is called without specifying the artifact_id. When multiple reports exist in a notebook, this could lead to downloading the content of the wrong report (the latest one by default) instead of the one that was just generated. The artifact_id is available and should be passed to ensure the correct report content is retrieved.

Suggested change

```diff
-    await client.artifacts.download_report(notebook_id, str(out_path))
+    await client.artifacts.download_report(notebook_id, str(out_path), artifact_id=artifact_id)
```
