May 24, 2025 · essay

My Evolving AI Power Stack: 30+ Tools Shaping Modern AI Solutions & Workflows

Explore my updated arsenal of 30+ essential tools – from local LLMs on NVIDIA GPUs, versatile model access via OpenRouter & Perplexity, to n8n automation, NotebookLM, Vue3/Nuxt3, and core dev stacks – that power modern AI application development.

The short version

LLM
This post outlines my personal toolkit of over 30 essential technologies, from core AI models and local hardware to development environments and cloud services, that I use daily for AI application development and workflow automation.
Why
To demystify the AI development landscape by showcasing a curated set of tools, explaining their roles and importance for building sophisticated, modern AI solutions to both technical and non-technical audiences.
Challenge
Effectively selecting and integrating a diverse range of tools in a rapidly evolving AI field. This post categorizes these tools to provide clarity on how they contribute to different stages of development, now including tools like NotebookLM and OpenRouter.
Outcome
A comprehensive overview of a practical 'power stack' that enables private, performant, and versatile AI development, emphasizing the synergy between local AI, flexible model access, powerful frameworks, automation, and core programming.
AI approach
While this post details the tools I use (often with AI assistants like Google Gemini for code generation), it highlights the ecosystem supporting an AI-First development philosophy, where humans orchestrate these tools and AI partners to build innovative solutions.
Learnings
Building advanced AI solutions requires a multifaceted toolkit. Local hardware (NVIDIA GPUs with Ollama) is crucial for private AI. Flexible model access (OpenRouter, Perplexity) and powerful frameworks (Vue3/Nuxt3, FastAPI) combined with automation (n8n) and solid dev practices (VS Code, Python, JS, Docker, GitHub) are key.

Introduction: The Modern AI Artisan's Workshop

The world of Artificial Intelligence is buzzing with an energy that's reshaping how we think, work, and create. It's an exhilarating time to be building, but let's be clear: crafting truly impactful AI solutions isn't about finding a single "magic bullet" tool. Instead, it’s more like equipping a master craftsperson's workshop. You need a diverse array of specialized tools, from heavy-duty machinery for foundational tasks to precision instruments for the fine-finishing touches.

Today, I want to pull back the curtain and share my curated toolkit – the collection of software, services, libraries, and even hardware that powers my daily development of sophisticated AI applications and automated workflows. This isn't an exhaustive list meant for everyone, but rather a practical glimpse into the ecosystem that helps me (and my AI collaborators) bring complex ideas to life.

My hope is to make this journey accessible, whether you're a seasoned developer, a curious product manager, a potential client, or simply someone intrigued by what goes into building with AI today. I'll explain the role of each tool in simple terms, highlighting why it's earned a spot in my ever-evolving arsenal – because in this dynamic field, the toolkit is always growing and adapting!

Categorizing the Toolkit: The Instruments of AI Creation

To make sense of this landscape, I've grouped my go-to tools into categories based on their primary role in the development lifecycle.

AI Toolkit Overview

1. The AI Brains: Interacting with & Accessing Large Language Models

These are the engines and gateways of modern AI, allowing us to generate text, analyze information, understand images, and much more. They are the core intelligence we interact with and orchestrate.

  • Google AI Studio (with Gemini 2.5 Pro Preview 05-06)
    • What it is: An online platform from Google for experimenting with and building applications using their latest and most powerful AI models, like Gemini 2.5 Pro.
    • Why it's in my Toolkit: My primary interface for deep, iterative collaboration with Google's cutting-edge AI. The extensive context window and multimodal capabilities of Gemini 2.5 Pro are game-changers for AI-first development. (Link: ai.google.dev/aistudio)
  • Ollama
    • What it is: A fantastic tool that lets you run powerful open-source LLMs (like Llama 3, Mistral, etc.) directly on your own computer.
    • Why it's in my Toolkit: Essential for local AI development, ensuring data privacy, offline capabilities, and cost-effective experimentation. (Link: ollama.com)
  • OpenAI API
    • What it is: A service from OpenAI that allows developers to integrate their latest and most capable models—such as GPT-4o (an omni-model for text and vision), the GPT-4.1 series (developer-focused), OpenAI o3 and o4-mini (reasoning models), DALL·E 3 (image generation), and Whisper (speech-to-text)—into applications. The Assistants API is also key for building agent-like experiences.
    • Why it's in my Toolkit: Provides access to a suite of industry-leading AI models, reliable for cloud-based LLM tasks requiring specific model strengths. (Link: platform.openai.com)
  • Google Gemini APIs
    • What it is: Programmatic access to Google's powerful Gemini family of models (like Gemini 2.5 Pro, Flash, etc.) for integration into custom applications.
    • Why it's in my Toolkit: Offers another excellent avenue to leading-edge multimodal AI capabilities from Google. (Link: ai.google.dev/docs/gemini_api_overview)
  • OpenRouter
    • What it is (Simplified): A unified API gateway that lets you access a vast array of LLMs from different providers (OpenAI, Anthropic, Google, Mistral, open-source models, etc.) using a single API key and a consistent request format.
    • Why it's in my Toolkit: The ultimate tool for AI 'polyamory'! It makes experimenting with, comparing, and switching between different LLMs incredibly easy, allowing me to pick the best (or most cost-effective) model for any specific task without rewriting integration code. Essential for A/B testing models or building workflows that dynamically route to different AIs. (Link: openrouter.ai)
  • Hugging Face
    • What it is: A massive hub for the open-source AI community, offering models, datasets, and tools.
    • Why it's in my Toolkit: An indispensable resource for finding and experimenting with a wide array of open-source AI models. (Link: huggingface.co)
  • Perplexity AI
    • What it is (Simplified): An AI-powered search and conversational answer engine that excels at research by providing answers with cited sources. It also offers access to various leading LLMs for Pro users (like GPT-4 Omni, Claude 3.5 Sonnet/Haiku, Grok-2, Gemini 2.0 Flash, etc.).
    • Why it's in my Toolkit: My go-to for quick, reliable research and getting up to speed on new topics. Its ability to cite sources is invaluable. For Pro users, it's also a great way to test queries against a diverse set of powerful models without needing separate API keys for each, excellent for initial exploration and model comparison. (Link: perplexity.ai)

2. The Local Powerhouse: Hardware for Private & Performant AI

Running powerful AI locally isn't just about software; the right hardware is the foundation.

  • NVIDIA RTX 4090 (Desktop GPU) & RTX 4070 (Laptop GPU)
    • What they are (Simplified): High-performance graphics cards that act as superchargers for local AI tasks.
    • Why they're in my Toolkit: Crucial for running larger Ollama models quickly and efficiently, enabling rapid, private AI development and testing. The 4090 for heavy lifting, the 4070 for portability. (Link for NVIDIA AI: nvidia.com/en-us/ai-data-science/)
    • (The synergy between these NVIDIA GPUs and Ollama is so fundamental to my local AI workflow, I'm planning a dedicated future post on this!)

3. Crafting the Code & Handling Data: Languages, Libraries & Environments

The fundamental tools for writing software and managing the data that fuels AI.

  • Python
    • What it is: A versatile programming language, popular in AI and data science.
    • Why it's in my Toolkit: Extensive libraries for AI/ML, data manipulation (Pandas), and web frameworks (FastAPI). (Link: python.org)
  • Pandas
    • What it is (Simplified): A powerful Python library for data analysis and manipulation, like supercharged spreadsheets in code.
    • Why it's in my Toolkit: Essential for loading, cleaning, transforming, and analyzing structured data for AI models. (Link: pandas.pydata.org)
  • JavaScript (and TypeScript)
    • What it is: The primary language for web browsers (JS), with TypeScript adding optional static typing.
    • Why it's in my Toolkit: Used for all frontend development (Vue.js, Nuxt.js) and Node.js environments like Electron. (JS Info: developer.mozilla.org, TS: typescriptlang.org)
  • VS Code (Visual Studio Code) + Copilot
    • What it is: VS Code is a popular code editor. Copilot is an AI pair programmer suggesting code.
    • Why it's in my Toolkit: VS Code is my primary editor. Copilot speeds up coding with intelligent suggestions. (VS Code: code.visualstudio.com, Copilot: github.com/features/copilot)
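To give a feel for the Pandas "supercharged spreadsheets" role in an AI pipeline, here is a small sketch with made-up records: dropping incomplete rows, normalising text, coercing bad numbers, and de-duplicating before anything reaches a model:

```python
import pandas as pd

# Hypothetical raw records, e.g. scraped articles destined for an LLM pipeline.
raw = pd.DataFrame({
    "title": ["Intro to RAG", None, "Local LLMs ", "Intro to RAG"],
    "words": ["1200", "300", "not a number", "1200"],
})

clean = (
    raw
    .dropna(subset=["title"])                      # drop rows missing a title
    .assign(
        title=lambda df: df["title"].str.strip(),  # normalise whitespace
        words=lambda df: pd.to_numeric(df["words"], errors="coerce"),
    )
    .drop_duplicates(subset=["title"])             # de-duplicate by title
    .reset_index(drop=True)
)
print(clean)  # two clean rows remain
```

A few chained calls replace what would be dozens of lines of hand-rolled loops, which is why Pandas sits at the front of almost every data-preparation step.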

4. Building the Application Logic: Frameworks & Backend Services

Tools that help structure applications and efficiently manage complex operations.

  • FastAPI
    • What it is: A modern, fast web framework for building APIs with Python.
    • Why it's in my Toolkit: Preferred for robust Python backends serving AI models or orchestrating workflows. (Link: fastapi.tiangolo.com)
  • Pydantic (for AI Data & Settings)
    • What it is (Simplified): A Python library for data validation and settings management using Python type hints. It ensures data conforms to a defined schema.
    • Why it's in my Toolkit: Absolutely critical for building reliable AI applications. I use Pydantic models to define the expected structure of data going into LLMs (e.g., for function calling or structured prompting) and to validate the (often JSON) outputs coming from them. This makes LLM interactions far more predictable and less prone to errors, which is essential when building agents or any system that depends on reliable data flow from AI models. It's the bedrock for ensuring data integrity in AI pipelines. (Link: pydantic.dev and relevant AI integrations like Instructor)

5. Automating Everything: Workflow & Integration Platforms

Connecting disparate systems and automating sequences of tasks is key to efficiency.

  • n8n.io
    • What it is: A powerful, extendable, source-available workflow automation tool.
    • Why it's in my Toolkit: My central hub for creating complex automations tying together APIs, AI models, and services. (Link: n8n.io)
  • Google Workspace Integrations
    • What it is: Google's suite of productivity tools (Gmail, Sheets, Drive).
    • Why it's in my Toolkit: Used as triggers/actions in n8n workflows and for AI integration. (Link: workspace.google.com)
  • Telegram
    • What it is: A popular messaging app with a robust bot API.
    • Why it's in my Toolkit: Excellent for building custom AI chatbots and notification systems integrated with n8n. (Link: telegram.org)

Workflow & Integration Platforms Comparison
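The typical glue pattern here is kicking off an n8n workflow from code (or from a Telegram bot) through a Webhook trigger node. A standard-library sketch; the instance URL and the `summarize-article` webhook path are placeholders for whatever your own Webhook node exposes:

```python
import json
import urllib.request


def build_webhook_call(base_url: str, path: str, payload: dict):
    """Prepare a POST to an n8n Webhook trigger node."""
    url = f"{base_url.rstrip('/')}/webhook/{path.lstrip('/')}"
    data = json.dumps(payload).encode()
    headers = {"Content-Type": "application/json"}
    return url, data, headers


def trigger(base_url: str, path: str, payload: dict) -> int:
    """Fire the webhook and return the HTTP status code."""
    url, data, headers = build_webhook_call(base_url, path, payload)
    req = urllib.request.Request(url, data=data, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    # Hypothetical workflow: fetch an article, summarise it, notify via Telegram.
    trigger("http://localhost:5678", "summarize-article",
            {"url": "https://example.com/post", "notify": "telegram"})
```

Once a workflow is behind a webhook like this, any script, cron job, or chatbot can become a trigger without n8n needing to know anything about the caller.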

6. Creating User Interfaces & Experiences: Web, Desktop & Rapid Prototypes

How users interact with the AI applications we build.

  • HTML, CSS, JavaScript (The Core Web Stack)
    • What they are: The fundamental building blocks of all websites and web applications.
    • Why they're in my Toolkit: Essential for any web-based frontend. (Link: developer.mozilla.org/en-US/docs/Web)
  • Vue.js (Vue 3)
    • What it is: A progressive JavaScript framework for building user interfaces.
    • Why it's in my Toolkit: My go-to for creating reactive, component-based web frontends. (Link: vuejs.org)
  • Nuxt (Nuxt 3)
    • What it is: A full-stack framework built on top of Vue.js, simplifying SSR apps, static sites, and more.
    • Why it's in my Toolkit: For comprehensive Vue applications needing server-side capabilities or static generation. (Link: nuxt.com)
  • Electron
    • What it is: A framework for creating native desktop applications using web technologies.
    • Why it's in my Toolkit: Enables me to build cross-platform desktop AI applications that can leverage local resources securely. (Link: electronjs.org)
  • Streamlit
    • What it is: An open-source Python library for creating custom web apps for ML and data science.
    • Why it's in my Toolkit: Perfect for rapidly building interactive UIs and dashboards for AI tools in Python. (Link: streamlit.io)

UI Frameworks Comparison

7. Deploying & Managing: Cloud, Infrastructure & Networking

Getting applications online, ensuring they run reliably, and connecting them securely.

  • Vercel
    • What it is: A cloud platform for frontend frameworks and static sites.
    • Why it's in my Toolkit: Preferred for deploying frontend applications quickly. (Link: vercel.com)
  • Cloudflare
    • What it is: A global network offering web performance, security, and Zero Trust networking tools.
    • Why it's in my Toolkit: Essential for securing web apps, improving performance, and creating secure tunnels (Cloudflare Tunnel) to expose local services. (Link: cloudflare.com)
  • Docker
    • What it is: A platform for developing, shipping, and running applications in containers.
    • Why it's in my Toolkit: Essential for packaging applications with complex dependencies. (Link: docker.com)
  • Google Cloud Console (GCP)
    • What it is: Google's suite of cloud computing services.
    • Why it's in my Toolkit: Provides a wide range of services for advanced cloud needs. (Link: console.cloud.google.com)
  • GitHub
    • What it is: A web-based platform for version control (Git) and collaboration.
    • Why it's in my Toolkit: The backbone for all my code: version control, collaboration, CI/CD. (Link: github.com)
Essential Tools for Cloud Management
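For the Docker piece, the day-to-day artifact is a small Dockerfile. A sketch of containerizing a Python/FastAPI AI backend; the file names, module path, and port here are placeholders, not a prescription:

```dockerfile
# Sketch: containerizing a hypothetical FastAPI AI backend.
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer between code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Serve the API (assumes app/main.py defines a FastAPI instance named `app`).
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```

The same image then runs identically on the desktop, behind a Cloudflare Tunnel, or in the cloud, which is the whole point of packaging dependencies this way.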

8. Specialized AI Toolkit: Data Ingestion and Parsing

Tools designed to prepare data for AI or enhance interaction with LLMs.

  • Crawl4AI
    • What it is: A Python library for web crawling and content extraction for AI applications.
    • Why it's in my Toolkit: Streamlines getting clean, AI-ready content from websites for custom knowledge bases. (Link: docs.crawl4ai.com)
  • LlamaParse
    • What it is: A service from LlamaIndex for parsing complex documents like PDFs with advanced layout understanding.
    • Why it's in my Toolkit: My go-to for industrial-strength OCR and parsing of challenging PDFs for LLM ingestion. (Link: docs.llamaindex.ai/llamaparse/)
  • code2prompt
    • What it is: A command-line tool to consolidate and prepare codebases as context for LLMs.
    • Why it's in my Toolkit: Invaluable for providing large, relevant code context to AI models. (Link: code2prompt.dev/docs/welcome/)

9. Ideas, Knowledge Management, Learning & Community: The Essential Ecosystem

Staying updated, brainstorming, managing knowledge, and connecting with others.

  • NotebookLM
    • What it is (Simplified): An AI-powered research and writing assistant from Google that grounds its responses in your uploaded source documents (up to 50 large sources including PDFs, text, Google Docs, websites, YouTube transcripts). It also features an audio overview (podcast-like) of your sources.
    • Why it's in my Toolkit: Incredibly powerful for synthesizing information from multiple, diverse sources, asking questions about my documents, and even getting audio summaries for on-the-go learning. Its source-grounding is key for accuracy. (Link: notebooklm.google.com)
  • tldraw
    • What it is: An online whiteboard for sketching and diagramming.
    • Why it's in my Toolkit: Perfect for quickly whiteboarding architectures and ideas. (Link: tldraw.com)
  • NapkinAI
    • What it is: An AI-powered tool for capturing and connecting ideas.
    • Why it's in my Toolkit: Helps connect disparate thoughts for long-term project ideation. (Link: app.napkin.ai)
  • Skool
    • What it is: A platform for building online communities.
    • Why it's in my Toolkit: Great for communities focused on AI and workflow automation. (Link: skool.com)
  • YouTube
    • What it is: The ubiquitous video-sharing platform.
    • Why it's in my Toolkit: Unparalleled resource for tutorials, AI news, and insights. (Link: youtube.com)

The Synergy: How These Tools Work Together

The real magic happens when these tools are combined. For example, I might research a topic using Perplexity AI and synthesize findings with NotebookLM, sketch an initial workflow idea in tldraw, then use Python with FastAPI and Ollama (powered by my NVIDIA GPU) to build a local AI microservice, containerized with Docker. An Electron app using Vue3/Nuxt3 could provide the desktop UI, or a Streamlit app for rapid prototyping. Crawl4AI or LlamaParse might gather data for this service, orchestrated by an n8n workflow that also pushes results to Google Sheets, with secure external access managed by Cloudflare. All code is managed on GitHub, and a user-facing web component could be deployed via Vercel, leveraging OpenRouter if flexible model access is needed. This interplay allows for rapid, flexible, and powerful AI development.
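Stripped of the real services, that synergy boils down to composable stages. A toy sketch in which every stage is a stub standing in for one of the tools above (all bodies are placeholders, not real integrations):

```python
from typing import Callable


# Each stub mirrors one tool's role in the pipeline described above.
def crawl(url: str) -> str:            # Crawl4AI / LlamaParse: fetch clean text
    return f"<clean text from {url}>"


def summarize(text: str) -> str:       # Ollama / OpenRouter: run the model
    return f"summary({text})"


def publish(summary: str) -> dict:     # n8n -> Google Sheets: store the result
    return {"status": "ok", "row": summary}


def pipeline(url: str, stages: list[Callable]) -> object:
    """Thread a value through each stage in order."""
    value: object = url
    for stage in stages:
        value = stage(value)
    return value


result = pipeline("https://example.com/post", [crawl, summarize, publish])
```

Because each stage only agrees on its input and output, any single tool can be swapped (a different parser, a different model gateway) without touching the rest of the chain.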

AI Development Workflow

Conclusion: Building the Future, One Tool at a Time

This snapshot of my current toolkit reflects the dynamic and multifaceted nature of AI development today. It’s a blend of powerful AI models, robust programming languages, versatile frameworks, efficient automation platforms, and essential learning resources. The "30+" in the title is deliberately open-ended: this field is constantly evolving, demanding continuous learning and adaptation.

Building sophisticated AI solutions is less about finding one perfect tool and more about skillfully orchestrating a suite of them. By understanding the strengths of each component, we can architect systems that are intelligent, efficient, and truly solve real-world problems.

What are your indispensable AI development tools? I'd love to hear about the "power stack" that fuels your innovations!