This NotebookLM + Claude Code Workflow Is Insane

Eric Tech
Computing/Software · Advertising/Marketing · Small Business/Startups · Job Search · Internet Technology

Transcript

00:00:00In this video, I'm gonna show you how you can combine
00:00:01the power of Claude Code and NotebookLM here
00:00:04using this tool called notebooklm-py,
00:00:07which is an open-source library
00:00:09that turns NotebookLM here
00:00:11into a CLI tool that can be used by AI agents.
00:00:14Now, you might be wondering why we should use this:
00:00:16Claude Code is really good at execution,
00:00:18but NotebookLM here, on the other hand,
00:00:20can turn your messy documentation, research,
00:00:22and different sources into a clear, grounded understanding
00:00:26that we can pass to Claude Code for execution.
00:00:28And take my use case as an example,
00:00:30which I'm gonna show you later on in this video.
00:00:32Simply, you can see, I was able to use
00:00:33the NotebookLM skill here inside of Claude Code
00:00:35and do a comparative analysis for the product
00:00:38that I built called Book Zero.
00:00:39And you can see here that I have asked it to analyze
00:00:42the 35 competitors that we have inside of this CSV data
00:00:46and do a deep comparative analysis
00:00:48for each competitor that we have.
00:00:49And furthermore, we can also use that as a knowledge base
00:00:51to decide exactly which product directions
00:00:54we should go for for any kind of use case, right?
00:00:56For example, based on the competitor research
00:00:59that we have done in the two notebooks that we have,
00:01:01we can then be able to answer questions like,
00:01:02what should we focus on next?
00:01:04And that is gonna look through all the Jira tickets
00:01:06that we have in our Jira board
00:01:08and be able to understand the current applications,
00:01:10combining with the knowledge base
00:01:11that we have for the competitor research.
00:01:13And furthermore, you can not only just use that
00:01:15for development here,
00:01:16you can also use that for making content as well.
00:01:18Right here, you can see these are all generated
00:01:20using Nano Banana 2, SEO skills,
00:01:22and also using the NotebookLM skill here,
00:01:24basically combining the knowledge base
00:01:26that we have for all the competitors
00:01:27to write blog post content
00:01:29that can compete with other competitors in the market.
00:01:32So you can see that these are all really practical use cases
00:01:34for NotebookLM inside of Claude Code.
00:01:37So with that being said,
00:01:38that's exactly what we're gonna cover in this video.
00:01:40And specifically, we're going to cover all the features
00:01:43that there are for the CLI,
00:01:44as well as how we're gonna install this
00:01:46onto our local machine,
00:01:47how we're gonna be able to set this up.
00:01:48And then furthermore,
00:01:49I'm gonna show you the NotebookLM skills here
00:01:52that I'm going to integrate into our AI agents.
00:01:55I'm gonna show you this all in this video.
00:01:57So with that being said, if you're interested,
00:01:58let's get into it.
00:01:59All right, so before we jump in,
00:02:00a quick intro for those who are new here.
00:02:02My name's Eric,
00:02:03and I spent years as a senior software engineer
00:02:05at companies like Amazon, AWS, and Microsoft.
00:02:08And I have started this channel
00:02:09to share everything that I learned along the way,
00:02:11from AI coding to automations, Web3,
00:02:15career developments, and more,
00:02:17all broken down into practical tutorials
00:02:19that you can actually follow.
00:02:21And of course, we also have a school community
00:02:23where you can get access to all the resources, templates,
00:02:26plus our community support.
00:02:27So if you're ready to level up,
00:02:29make sure to check out my YouTube channel
00:02:30and hit that subscribe button.
00:02:32Now let's get back to the video.
00:02:34All right, so to get started,
00:02:34first thing first we're gonna do here
00:02:35is to navigate to notebooklm-py.
00:02:38And I'll make sure to put this link
00:02:39for the repository here in the link description
00:02:41so that you can find it.
00:02:42And basically what this repository does
00:02:44is it contains all the NotebookLM skills,
00:02:46as well as the Python APIs and CLIs
00:02:49that let people use Claude Code here or AI agents
00:02:52programmatically access NotebookLM's features.
00:02:55And here you can see for this repository here,
00:02:57it contains all the complete features
00:02:59that NotebookLM here covers.
00:03:00For example, you can be able to create a notebook,
00:03:02list out notebooks, or be able to rename or delete.
00:03:05You can also be able to insert all the sources you want
00:03:07and be able to extract questions or conversation histories,
00:03:09as well as setting the persona here in the chats.
00:03:12And then we can also be able to set the research here
00:03:14to deep mode or fast mode with auto imports.
00:03:17And furthermore, you can also be able to download
00:03:19anything that you generated using NotebookLM,
00:03:21for example, audio, video, slide deck,
00:03:23all those kinds of things.
00:03:24You can also be able to extract them using this tool as well.
00:03:28So all the functionality that's covered on the web UI,
00:03:31you can do the same thing using the CLI as well.
00:03:33So in our case here, let's take a look
00:03:35at how we can be able to install this onto our local machine.
00:03:37So right here you can see it has the installation section,
00:03:40and simply we're just gonna install the basic installation
00:03:42plus the browser login support
00:03:44so that we can log into the first time in the browser
00:03:46and save that credential.
00:03:47So in this case, I'm just gonna copy this right here.
00:03:50And then here, I'm gonna head over to a new terminal section.
00:03:52And here you can see I have a folder
00:03:53called erictech-notebook-lm.
00:03:55And what I'm gonna do here is I'm gonna first create
00:03:57our virtual environment first.
00:03:59So in this case, this is the command for this.
00:04:01So once I create the virtual environment,
00:04:03I'm gonna activate this.
00:04:04So after I activate this,
00:04:06I'm going to paste the command for the installation now.
00:04:09All right, so now once we have this,
00:04:11then what we can do is to install this completely.
00:04:13And here you can see this is what the end result looks like
00:04:15after we have installed this.
00:04:16And now we can also be able to verify to see
00:04:18if our notebook-lm CLI here is installed
00:04:21by checking the version.
00:04:22And currently you can see this is the version I'm using
00:04:24for the notebook-lm CLI.
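The setup steps above can be sketched as a short shell session. The virtual-environment commands are standard Python tooling; the commented-out install and version lines are assumptions based on the walkthrough, so copy the exact package name, extras, and CLI name from the notebooklm-py repository's README rather than from here:

```shell
# Create and activate an isolated Python environment for the CLI.
python3 -m venv .venv
. .venv/bin/activate

# Install with browser-login support (the extras name is an assumption;
# use the exact command from the repository's installation section):
# pip install "notebooklm-py[browser]"

# Verify the CLI is on PATH (the command name may differ; check the repo):
# notebooklm --version
```

Keeping the install inside a virtual environment also means the skill files and the CLI stay pinned per project, which matters once an agent starts invoking them.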
00:04:26So the next thing we're gonna take a look at
00:04:28is how we can be able to authenticate ourselves
00:04:29for our notebook-lm.
00:04:30And this is the quick video guide.
00:04:32And basically what you can do here
00:04:33is that you can just use this command right here
00:04:35to log in with the browser here.
00:04:37So now if I were to head over to terminal,
00:04:39and if I were to paste that command,
00:04:41it's gonna open a browser.
00:04:42And simply, we're just gonna sign in with Google here.
00:04:44And it'll basically authenticate ourselves for our notebook-lm.
00:04:47So right here, you can see after I sign in,
00:04:49it's gonna save our credentials in our root directory.
00:04:51So now here you can see,
00:04:52once we have the CLI command here installed and connected,
00:04:54the next thing we're gonna do here
00:04:55is we can do all kinds of things,
00:04:56like creating notebooks, chatting with the resources that we have,
00:04:59or generating content and downloading artifacts,
00:05:01all kinds of things, right?
00:05:02These are all the CLI commands that we can use
00:05:05to do all kinds of things with our notebook-lm.
00:05:07But the most important thing we're gonna do here
00:05:08is to make sure to install the skills
00:05:10so that we can be able to pass the knowledge
00:05:11on how to use the CLI to the large language model here,
00:05:14or the AI agents,
00:05:15to be able to connect our Claude Code here with our notebook-lm.
00:05:18And then to do so here,
00:05:19you can see this is the entire agent setup.
00:05:20One option here we can do is to install it using CLI,
00:05:23which is using the notebook-lm here to install all the skills.
00:05:26And the other option here,
00:05:27if you wanna use the open skill ecosystem using the NPX,
00:05:30here's the command that you can do so.
00:05:31But honestly, the results that we're getting
00:05:33for two options here are the same.
00:05:34So in this case, I'm gonna copy the first option here
00:05:36to install the skill into our root directory,
00:05:39and so that we can be able to use it
00:05:40for all kinds of projects.
00:05:41So in this case, I'm gonna open a new terminal,
00:05:44paste that command right here.
00:05:45You can see we have a notebook-lm skill here
00:05:47fully installed in our root directory.
00:05:48And now we have Claude Code here
00:05:50recognizing the notebook-lm skills, right,
00:05:53the notebook-lm commands.
00:05:54And simply, we're just gonna reference them
00:05:55either using the slash command here,
00:05:57or using natural language
00:05:59to basically reference the notebook-lm skills
00:06:01that we have set up.
00:06:02All right, so once we know how to install
00:06:04our notebook-lm skills and also our CLI,
00:06:06let's take a look at how we can be able to use this
00:06:08in a practical workflow.
00:06:09So right here, you can see I have a product called bookzero.ai,
00:06:12which is a product that I built using AI here
00:06:14to manage bookkeeping for businesses.
00:06:16And what I wanna do here is I wanna use
00:06:18notebook-lm here to basically analyze
00:06:2035 AI financial competitors that live in the CSV data.
00:06:24And I wanna do a deep competitive analysis
00:06:26for each competitor that we have,
00:06:28like understand what it does, selling points, pricing,
00:06:31uniqueness for marketing, and also comparison pages
00:06:34that we're gonna have.
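To make "organized into tiers" concrete before handing the CSV to the agent, here is a minimal sketch of grouping competitor rows by tier. The column names (name, tier, url) and the sample rows are illustrative assumptions, not the video's actual schema:

```python
import csv
import io

# A miniature stand-in for the real competitors CSV; the column names
# and rows here are hypothetical, not the video's actual data.
SAMPLE = """name,tier,url
QuickLedger,1,https://example.com/a
TaxBotPro,2,https://example.com/b
MarketWatchr,3,https://example.com/c
"""

def load_competitors(text: str) -> dict[int, list[str]]:
    """Group competitor names by tier, so each tier can be researched
    at a different depth (deep queries vs. fast queries)."""
    by_tier: dict[int, list[str]] = {}
    for row in csv.DictReader(io.StringIO(text)):
        by_tier.setdefault(int(row["tier"]), []).append(row["name"])
    return by_tier

tiers = load_competitors(SAMPLE)
```

With the rows bucketed like this, tier 1 can be routed to deep research and tiers 2 and 3 to fast research, which is exactly the split the architecture below relies on.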
00:06:35And here you can see this is the entire architecture
00:06:37of how we're gonna perform this research.
00:06:39So out of all the 35 competitors we have,
00:06:41we actually sort them or organize them into different tiers.
00:06:44So you can see that for tiers here,
00:06:45we have direct competitors, adjacent competitors,
00:06:48and also the tier three competitors that we have.
00:06:50So what we wanna do here is we wanna put the tier one
00:06:52and tier twos into a single notebook
00:06:54because we only have 300 sources
00:06:56that we can insert per notebook.
00:06:58And the first notebook here is gonna be
00:07:00our direct competitors, and the second notebook here
00:07:02is gonna be just the market data.
00:07:04So here you can see what we're gonna do here
00:07:06is we're gonna do a deep research right here,
00:07:08so deep queries for the top eight close competitors
00:07:10that we have, and also 10 fast queries here
00:07:13for the tier two competitors that we have.
00:07:15And roughly the total here is gonna be 250 sources
00:07:18that we're gonna add into this notebook right here.
00:07:20And then for the second notebook here,
00:07:21we're just gonna have a fast research for all 17 of them,
00:07:25and roughly we're gonna get 136 sources
00:07:27inserted into the second notebook.
00:07:29And as the output, we're going to get a report
00:07:31and also a mind map as well as a slide deck
00:07:34on the compare analysis that we have inserted.
00:07:36And that's exactly how we're gonna do this.
00:07:37And then right here you can see
00:07:38these are the entire execution steps
00:07:40on how we're gonna achieve this step by step.
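The tier-to-notebook split described above is really a budgeting problem against NotebookLM's 300-source-per-notebook cap. A rough sketch of that arithmetic, where the per-query source counts (25 per deep query, 5 per fast query) are illustrative guesses rather than the tool's actual import sizes (the video's second notebook landed at 136 sources, so real fast queries vary):

```python
SOURCE_CAP = 300   # NotebookLM's per-notebook source limit
DEEP_SOURCES = 25  # assumed sources imported per deep-research query
FAST_SOURCES = 5   # assumed sources imported per fast-research query

def plan_notebooks(deep_tier1, fast_tier2, fast_tier3):
    """Estimate source counts for the two-notebook layout:
    notebook 1 holds tier 1 (deep) + tier 2 (fast),
    notebook 2 holds tier 3 (fast). Raise if a cap would be blown."""
    nb1 = len(deep_tier1) * DEEP_SOURCES + len(fast_tier2) * FAST_SOURCES
    nb2 = len(fast_tier3) * FAST_SOURCES
    for name, total in (("notebook 1", nb1), ("notebook 2", nb2)):
        if total > SOURCE_CAP:
            raise ValueError(f"{name} exceeds the cap: {total} > {SOURCE_CAP}")
    return {"notebook 1": nb1, "notebook 2": nb2}

# The video's split: 8 deep tier-1, 10 fast tier-2, 17 fast tier-3.
estimate = plan_notebooks(range(8), range(10), range(17))
```

Under these assumed averages, notebook 1 comes out at 250 sources, matching the video's rough total and leaving headroom under the cap.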
00:07:42So in this case, I'm just gonna run this
00:07:44and let's take a look at what the result look like.
00:07:46Quick pause for a second.
00:07:47While I was researching tools around this topic,
00:07:50I ended up testing a platform called JobRight,
00:07:52and it's actually pretty interesting
00:07:54if you're currently job hunting.
00:07:55One thing I've noticed about applying to jobs online
00:07:58is that most of the time isn't spent finding roles.
00:08:01It's spent dealing with the process around them,
00:08:03rewriting resumes, filling out forms,
00:08:05and trying to figure out whether a job is even a good fit.
00:08:08JobRight tries to simplify that whole workflow.
00:08:11When you upload your resume,
00:08:12the platform analyzes it and builds a full profile
00:08:15of your skills, experience,
00:08:17and the kinds of roles that might make sense for you.
00:08:19From there, it starts recommending jobs
00:08:21through their job matching system.
00:08:23And what's helpful is that it doesn't just show listings.
00:08:26It actually explains why a role matches your background.
00:08:29Then there's Resume AI,
00:08:30which can generate tailored versions of your resume
00:08:32based on the job description.
00:08:34So instead of rewriting your resume every time you apply,
00:08:37the system adapts it automatically.
00:08:39The part I thought was particularly useful
00:08:42is their Chrome Autofill extension.
00:08:44Once you answer the common application questions once,
00:08:47it can autofill most job application forms in seconds.
00:08:50They also have something called Insider Connections,
00:08:53which helps you see potential connections
00:08:54inside companies you're applying to.
00:08:56So you're not just sending applications into a black hole.
00:08:59And if you want guidance, there's Orion AI,
00:09:01which basically acts like a career assistant.
00:09:04You can ask it questions about roles, hiring trends,
00:09:07or how to improve your chances for a specific job.
00:09:09Taken together, it feels less like a single tool
00:09:12and more like a platform built to handle
00:09:14the messy parts of job searching.
00:09:16If you want to check it out,
00:09:17you can try JobRight using the link in the description.
00:09:20It's currently free and you can also sign up
00:09:22for early access through the link below.
00:09:24All right, now let's get back to the video.
00:09:26All right, so now you can see as a result,
00:09:27we have five deliverables successfully downloaded
00:09:30inside of our docs folder,
00:09:31inside of our marketing competitor analysis folder.
00:09:34So these are the PPT, MD, and also the JSON files
00:09:37for everything that we have done for notebook one
00:09:39and notebook two for the research.
00:09:40And here you can see it gives you a complete analysis
00:09:42of the entire MD file for this complete niche
00:09:45that we're currently in.
00:09:46And now if I were to open the slide deck,
00:09:48this is what it exactly looks like.
00:09:50So here you can see we have different slides.
00:09:52They're all generated using Nano Banana 2 here.
00:09:54And here you can see furthermore,
00:09:55I can also be able to open my notebook here
00:09:57and be able to view the notebooks that we have created.
00:09:59For example, the direct and adjacent notebooks
00:10:01and also the market landscape that we have added.
00:10:04So 300 sources and 171 sources that are added
00:10:07inside of both notebooks right here.
00:10:08So if I were to open one of them, for example,
00:10:11here you can see these are all the resources
00:10:12that we have added.
00:10:13And now if I were to ask any questions, right?
00:10:15For example, based on the book zero product that we have,
00:10:19what is our selling point?
00:10:20How is it unique compared to other competitors that we have?
00:10:23And what should we focus on for the product vision
00:10:25based on the competitor analysis?
00:10:27So if I were to ask this question here,
00:10:29it should be able to look through all the sources
00:10:30that we have added for the deep research
00:10:32and be able to answer these kinds of questions.
00:10:33And here you can see, I basically changed the settings here
00:10:36to be learning guide and keep it shorter for the response.
00:10:39And here you can see,
00:10:40this is the entire response that I'm getting.
00:10:42So your core selling point here is ultra-fast,
00:10:44highly accurate receipt extraction and matching.
00:10:47And here you can see clearly labeled
00:10:49what the selling point is.
00:10:50And you can see it also gives you an analysis
00:10:53of what the other competitors are doing, right?
00:10:55So the Book Zero uniqueness here lies
00:10:57in the hyper-simple three-step upload,
00:11:00import, match workflow,
00:11:01specifically designed to put the US and Canadian market
00:11:04on bookkeeping autopilot,
00:11:06without a high learning curve.
00:11:08So based on the competitor trends here,
00:11:10the market is aggressively moving
00:11:11towards conversational AI systems
00:11:13and continuous, zero-touch bank reconciliations.
00:11:16And for your product vision,
00:11:18you should focus on expanding from receipt matching
00:11:20into continuous, real-time ledger reconciliations,
00:11:23providing automated and actionable financial insights.
00:11:26So that's exactly what it tells me to do
00:11:28for the product vision, very, very short and concise,
00:11:31without having to read through a very long essay.
00:11:33I can simply just set this inside of the settings here
00:11:35for the configuration setting to keep the response short
00:11:38and tell me the exact answer.
00:11:40So there you guys have it.
00:11:41That's basically how you can combine the power of Claude Code
00:11:43and NotebookLM to build these insane automations.
00:11:46And in this video,
00:11:47we went over how you can set this up on your local machine
00:11:49and what are some practical use cases
00:11:51on how you can use that to build applications, right?
00:11:53Making product decisions
00:11:55or even building anything using Claude Code and NotebookLM.
00:11:58And of course, if you're currently building products
00:11:59and you want to improve your product marketing
00:12:01using Claw Code,
00:12:02then be sure to check out this video right here
00:12:04for how to use Claude Code with the 43 skills that are created
00:12:08to improve your product marketing.
00:12:09So be sure to check that out.
00:12:11And so pretty much that's it for this video.
00:12:12And if you do find this video helpful,
00:12:14please make sure to like this video
00:12:15and consider subscribing for more content like this.
00:12:17But with that being said, I'll see you in the next video.

Key Takeaway

By integrating NotebookLM-py with Claude Code, users can transform vast amounts of unstructured data into actionable, grounded insights for product development and marketing through an automated AI agent workflow.

Highlights

Introduction of NotebookLM-py, an open-source library that enables AI agents to access NotebookLM features via CLI and Python APIs.

The synergy between Claude Code's execution capabilities and NotebookLM's ability to ground AI in messy documentation and research.

Step-by-step guide for local installation, including virtual environment setup and Google authentication for CLI access.

Real-world application for competitive analysis, processing 35 AI financial competitors into structured knowledge bases.

Automated generation and downloading of research artifacts like Markdown reports, slide decks, and mind maps.

Advanced querying techniques to derive product vision and unique selling points (USPs) from a massive repository of over 470 sources.

Timeline

Introduction to the NotebookLM and Claude Code Workflow

The speaker, Eric, introduces a powerful new workflow that combines Claude Code with NotebookLM-py to bridge the gap between execution and research. He explains that while Claude Code excels at performing tasks, NotebookLM is superior at organizing messy documentation into grounded knowledge for AI agents. A practical use case is teased involving a competitive analysis of 35 different products for a bookkeeping tool called Book Zero. This section highlights how the tool can generate SEO content, blog posts, and product directions based on competitor data. Eric emphasizes that this setup allows for professional-grade analysis that goes beyond standard LLM capabilities.

Project Background and Personal Introduction

Eric provides a brief background of his professional experience as a former senior software engineer at Amazon, AWS, and Microsoft. He explains the mission of his channel, which focuses on AI coding, automation, Web3, and career development through practical tutorials. This context establishes his credibility in discussing complex AI integrations and software engineering workflows. He also mentions his community platform where viewers can access specific templates and resources mentioned in the video. The transition serves to build trust with the audience before diving into the technical installation steps.

Technical Setup and NotebookLM-py Repository Overview

The tutorial begins with an overview of the NotebookLM-py GitHub repository, which contains the Python APIs and CLI tools necessary for the integration. Eric lists the core features available through the CLI, such as creating notebooks, inserting sources, and setting specific AI personas for chats. He notes that the tool supports deep research modes and allows users to programmatically download generated artifacts like audio, video, and slide decks. This section is crucial because it demonstrates that nearly all web UI functionality is accessible via code. It sets the stage for users to move from manual data entry to automated AI research pipelines.

Installation and Authentication Process

This section provides a detailed walk-through of installing the library on a local machine using a terminal. Eric demonstrates creating a Python virtual environment and using pip to install the notebook-lm package with browser login support. He shows the specific command to authenticate via Google, which saves the necessary credentials in the root directory for future use. Verification of the installation is performed by checking the CLI version to ensure everything is configured correctly. This technical foundation is essential for anyone looking to replicate the agentic workflow on their own hardware.

Integrating NotebookLM Skills with AI Agents

The focus shifts to connecting the installed CLI with AI agents like Claude Code by installing specific 'skills.' Eric explains two methods for this: using the internal notebook-lm install command or using an NPX command for the open skill ecosystem. By adding these skills to the root directory, Claude Code gains the ability to recognize and execute NotebookLM commands through natural language. This step is the 'secret sauce' that allows the AI to understand how to query the knowledge base during a coding or research session. It effectively turns the LLM into a controller for the NotebookLM API.

Case Study: Deep Competitive Analysis for Book Zero

Eric applies the tool to a real-world scenario by analyzing 35 competitors for his product, Book Zero, using CSV data. He outlines a sophisticated architecture that organizes competitors into tiers across two separate notebooks to manage source limits. The workflow involves performing deep queries for top competitors and fast queries for tier-two players, totaling over 470 sources of information. He explains the intended outputs: a comprehensive markdown report, a mind map, and a slide deck for stakeholder presentation. This section illustrates the scale of data that the automated workflow can handle compared to manual research.

Sponsored Segment: Job Searching with JobRight

The video features a brief sponsorship segment for JobRight, an AI-powered platform designed to streamline the job hunting process. Eric describes how the tool analyzes resumes to build profiles and recommends roles based on specific skill matches. He highlights features like Resume AI for tailoring documents and a Chrome extension that autofills complex application forms. Additionally, he mentions 'Orion AI,' a career assistant that helps users understand hiring trends and improve their chances for specific roles. This segment connects the theme of AI automation to the broader context of career management and efficiency.

Analyzing Results and Generating Product Insights

Returning to the technical demo, Eric shows the successfully downloaded artifacts, including a detailed Markdown analysis and a slide deck generated with Nano Banana 2. He demonstrates querying the notebook to identify Book Zero's unique selling points, such as its ultra-fast receipt extraction and simple three-step workflow. The AI provides strategic advice, suggesting a move toward conversational AI and real-time ledger reconciliation based on market trends. Eric shows how to adjust configuration settings to keep AI responses short and concise for better readability. This final demonstration proves the utility of the tool in making high-level product decisions.

Conclusion and Final Recommendations

The video concludes with a summary of the benefits of combining Claude Code and NotebookLM for high-level automation. Eric encourages viewers to experiment with the setup for their own applications and product marketing strategies. He references another video regarding 43 specific skills created to enhance product marketing with Claude Code. The wrap-up reminds the audience to like, subscribe, and check the description for all relevant repository links and community resources. This closing provides a clear path forward for users interested in advancing their AI engineering skills.
