Product
2025-12-10

GreptimeDB MCP Server v0.3: Data Masking, Pipeline Management, and Upgraded Prompts


Model Context Protocol (MCP) is Anthropic's open protocol for connecting AI assistants to external tools and data. GreptimeDB MCP Server v0.3 (latest: v0.3.1) adds data masking for secure AI queries, tools for managing log pipelines, and production-ready prompt templates with documentation references.

Breaking News: the Linux Foundation has established the Agentic AI Foundation, and Anthropic has donated the MCP protocol to it. Block's Goose and OpenAI's AGENTS.md have been donated as well, among other projects.

If you're new to GreptimeDB MCP Server, check out our introduction blog post for background on how MCP bridges LLMs and observability databases.

What's New

Data Masking

When LLMs query databases, sensitive data can leak into responses. Ask Claude to query a users table, and you might see passwords or API keys in the output. v0.3 automatically masks credentials, financial data, and personal identifiers before they reach the LLM.

Built-in patterns cover three categories:

  • Authentication: password, passwd, pwd, secret, token, api_key, apikey, access_key, private_key, credential, auth, authorization
  • Financial: credit_card, creditcard, card_number, cardnumber, cvv, cvc, pin, bank_account, account_number, iban, swift
  • Personal: ssn, social_security, id_card, idcard, passport

Matched values appear as ****** in all output formats. Configure via environment variables:

bash
# Disable masking (enabled by default)
GREPTIMEDB_MASK_ENABLED=false

# Add custom patterns (comma-separated, extends defaults)
GREPTIMEDB_MASK_PATTERNS=phone,address,email
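
As a hypothetical illustration, here is how a row from a users table might reach the LLM: the api_key and password columns match built-in patterns, while email stays intact because it isn't in the default list (the column names and values below are made up):

json
{
  "user": "alice",
  "email": "alice@example.com",
  "api_key": "******",
  "password": "******"
}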

Data Masking Example

TQL and RANGE Query Tools

Three new tools for time-series analysis:

  • execute_tql: Run PromQL-compatible queries with start, end, step, and optional lookback
  • query_range: Execute RANGE/ALIGN aggregations with time-window semantics
  • explain_query: Analyze SQL or TQL execution plans; use EXPLAIN ANALYZE for actual metrics

Example TQL query:

json
{
  "query": "rate(http_requests_total[5m])",
  "start": "2024-01-01T00:00:00Z",
  "end": "2024-01-01T01:00:00Z",
  "step": "1m"
}

For more on TQL syntax, see the TQL documentation.
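
query_range, by contrast, takes GreptimeDB's SQL RANGE/ALIGN syntax rather than PromQL. A minimal sketch, assuming the tool accepts the query as a single sql parameter (the parameter, table, and column names here are illustrative; check the tool's schema in your MCP client):

json
{
  "sql": "SELECT ts, host, avg(cpu_usage) RANGE '5m' FROM monitor ALIGN '1m' BY (host)"
}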

Pipeline Management: AI-Powered Log Parsing

GreptimeDB's log pipeline transforms raw logs into structured data. v0.3 adds MCP tools to manage pipelines directly:

  • create_pipeline: Create pipelines with YAML configuration
  • dryrun_pipeline: Test pipelines without writing to the database
  • delete_pipeline: Remove specific pipeline versions
  • list_pipelines: View existing pipelines and versions

The pipeline_creator prompt helps LLMs generate pipeline configs from log samples. It covers:

  • Processors: dissect, regex, date, epoch, gsub, select
  • Transform configuration and data types
  • Index best practices: inverted, fulltext, skipping
  • Table design for log data

See this example where Claude builds a pipeline from nginx logs, tests it with dryrun_pipeline, and deploys with create_pipeline.
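
As a rough sketch of that workflow, a dryrun_pipeline call could pass a pipeline name plus a raw log line to parse; the parameter names below are illustrative rather than the tool's exact schema:

json
{
  "pipeline_name": "nginx_access",
  "data": "192.168.1.1 - - [25/May/2024:20:16:37 +0000] \"GET /index.html HTTP/1.1\" 200 612"
}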

AI-Powered Log Parsing

For pipeline configuration details, refer to the Pipeline Configuration Reference.

Production-Ready Prompts

All seven prompt templates now include References sections linking to official docs:

  • pipeline_creator: Generate pipelines from log samples (references: Pipeline Config, Data Index)
  • log_pipeline: Log analysis with full-text search (references: Full-Text Search, SQL Functions)
  • metrics_analysis: Metrics monitoring (references: RANGE Query, Data Model)
  • promql_analysis: PromQL/TQL guidance (references: TQL Reference)
  • iot_monitoring: IoT device analysis (references: Table Design)
  • trace_analysis: Distributed tracing with OpenTelemetry (references: Traces Overview)
  • table_operation: Schema diagnostics (references: INFORMATION_SCHEMA)

We also fixed query syntax issues in the trace_analysis and table_operation templates.

Other Improvements

  • HTTPS Support: Connect over TLS with GREPTIMEDB_HTTP_PROTOCOL=https or --http-protocol https
  • Security Gate Fix: SHOW CREATE TABLE now works correctly
  • Performance: HTTP connections reuse aiohttp.ClientSession
  • Output Formats: All query tools support csv, json, and markdown

Getting Started

Install via pip:

bash
pip install greptimedb-mcp-server

Or run with uv:

bash
uv run -m greptimedb_mcp_server.server

Configure your MCP client (Claude Desktop, Cursor, etc.):

json
{
  "mcpServers": {
    "greptimedb": {
      "command": "greptimedb-mcp-server",
      "args": [
        "--host", "localhost",
        "--port", "4002",
        "--database", "public"
      ]
    }
  }
}

For pipeline management, ensure the HTTP port (default 4000) is accessible.
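
Settings covered earlier, such as HTTPS and custom masking patterns, can also be supplied through the client configuration's env map (supported by Claude Desktop and most MCP clients) instead of shell exports; a minimal sketch:

json
{
  "mcpServers": {
    "greptimedb": {
      "command": "greptimedb-mcp-server",
      "args": [
        "--host", "localhost",
        "--port", "4002",
        "--database", "public"
      ],
      "env": {
        "GREPTIMEDB_HTTP_PROTOCOL": "https",
        "GREPTIMEDB_MASK_PATTERNS": "phone,address,email"
      }
    }
  }
}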

What's Next

This release upgrades the MCP server to use the FastMCP API, currently with stdio transport. Next, we plan to support HTTP SSE transport, enabling the server to run persistently as a standalone service. See Issue #23 for details.

Learn More

Join our community to get the latest updates and discuss with other users.