This skill provides instructions and workflows for querying a local DuckDB warehouse (userdata/warehouse.duckdb). It guides agents to list tables, inspect schemas, run SQL queries, ingest CSV/Parquet/JSON files, and export results. It also prescribes an analysis workflow: run queries with the -json flag, summarize key metrics, create chart specs, and present results in markdown.
Use when the user asks to explore or analyze project data, load files into the warehouse, produce aggregations (monthly revenue, product performance), or generate charts/tables from query output. Appropriate for ad-hoc data questions like “show top products by revenue” or “load this CSV and preview rows.”
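The workflow above can be sketched as a short CLI session. This is a hedged example, not part of the skill itself: the warehouse path comes from the skill description, but the table name (orders), column names (product, revenue), and file names (sales.csv, sales.parquet) are illustrative assumptions.

```shell
# List tables and inspect a schema ('orders' is an assumed table name).
duckdb userdata/warehouse.duckdb -c "SHOW TABLES;"
duckdb userdata/warehouse.duckdb -c "DESCRIBE orders;"

# Run an aggregation with -json so the output is machine-readable
# for the summarize/chart steps of the analysis workflow.
duckdb userdata/warehouse.duckdb -json -c "
  SELECT product, SUM(revenue) AS total_revenue
  FROM orders
  GROUP BY product
  ORDER BY total_revenue DESC
  LIMIT 10;"

# Ingest a CSV into a new table, then export query results to Parquet
# (file names here are hypothetical).
duckdb userdata/warehouse.duckdb -c \
  "CREATE TABLE sales AS SELECT * FROM read_csv_auto('sales.csv');"
duckdb userdata/warehouse.duckdb -c \
  "COPY (SELECT * FROM sales) TO 'sales.parquet' (FORMAT PARQUET);"
```

Each command opens the same database file, so results persist between invocations; an agent could equally run all statements in a single interactive session.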
Intended for CLI-capable agents that can run shell commands and SQL, such as Copilot/Code-style agents and automation agents able to execute the DuckDB binary.
This skill has not been reviewed by our automated audit pipeline yet.
Dagster
Build and manage Dagster data pipelines -- create assets, jobs, schedules, sensors, and resources.
Dlt
Build data ingestion pipelines with dlt (data load tool) -- extract from APIs, databases, and files, then load to any destination.
dbt CLI Assistant
Run and manage dbt projects via the dbt CLI -- initialise projects, run/build models, run tests, generate docs, and debug pipelines.
Metabase (dashboard & questions manager)
Manage Metabase instances: create and run questions, manage dashboards and collections, and interact with the Metabase REST API for analytics workflows.
PostgreSQL
Query and manage PostgreSQL databases via psql: run queries, inspect schemas and tables, check active connections, and perform basic administration and exports.
BigQuery
Query and manage Google BigQuery datasets with the bq CLI: run SQL, inspect schemas, list tables, load CSV/JSON, and manage partitioning.