I Got Tired of Hunting for API Code, So I Made an MCP
Get the tool:
- MCP Civic Data — 7 government and open data APIs in one place
- All my tools — Full collection of agents, skills, and plugins
Quick Start: Add to your Claude config
```json
{
  "mcpServers": {
    "civic-data": {
      "command": "python",
      "args": ["-m", "mcp_govt_api"],
      "env": {
        "OPENWEATHER_API_KEY": "your-key-here",
        "NASA_API_KEY": "optional-for-higher-limits"
      }
    }
  }
}
```
Or install directly:
```bash
pip install mcp-civic-data
```
Every developer has that folder. The one full of old projects, half-finished experiments, and random scripts you wrote at 2am to solve a problem you'll definitely face again someday.
Mine is full of API integrations. Weather data for a dashboard I built three years ago. Census queries from a demographics project. NASA image fetching for something I don't even remember anymore. World Bank economic indicators for a data visualization thing.
Every time I need one of these APIs again, I go digging. Which repo was that in? What was the authentication pattern? Did I handle rate limiting? Where's that helper function that parsed the response correctly?
I finally got tired of it.
The Problem with Scattered Code

Over the years I've built projects that touched all kinds of public data sources. Government APIs are surprisingly useful and mostly free, but they all have different authentication patterns, response formats, and quirks.
NOAA weather data requires understanding their grid-based forecast system. The Census API has its own query language. NASA has multiple endpoints with different rate limits. The World Bank API returns nested JSON that needs careful parsing.
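To make the NOAA quirk concrete: you can't ask for a forecast by coordinates directly. You first resolve the point to a forecast office and grid square, then fetch the forecast for that grid. Here's a rough sketch of that two-step dance; the helper name and User-Agent string are mine, not the package's:

```python
import requests

# api.weather.gov asks for an identifying User-Agent; any contact string works.
HEADERS = {"User-Agent": "civic-data-demo (you@example.com)"}

def noaa_forecast(lat: float, lon: float) -> list[dict]:
    """Return the 7-day forecast periods for a US latitude/longitude."""
    # Step 1: resolve the point to a forecast office + grid square.
    point = requests.get(f"https://api.weather.gov/points/{lat},{lon}",
                         headers=HEADERS, timeout=10)
    point.raise_for_status()
    forecast_url = point.json()["properties"]["forecast"]

    # Step 2: fetch the actual forecast for that grid square.
    forecast = requests.get(forecast_url, headers=HEADERS, timeout=10)
    forecast.raise_for_status()
    return forecast.json()["properties"]["periods"]

if __name__ == "__main__":
    for period in noaa_forecast(30.2672, -97.7431)[:4]:  # Austin, TX
        print(period["name"], "-", period["shortForecast"])
```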
Each time I figured one of these out, I wrote code that worked. Then that code sat in whatever project needed it at the time. When the next project came along and needed the same data, I either rewrote it from scratch or went spelunking through old repos trying to find the original.
This is a stupid way to work.
Wrapping Everything in One Place

The solution was obvious once I started thinking about it. Take all these API integrations I've built over the years, clean them up, and expose them through a single MCP server. Now Claude can access any of them directly, and I never have to hunt for that code again.
The result is 22 tools covering 7 different data sources:
| API | What You Get |
|-----|--------------|
| NOAA Weather | US forecasts, alerts, radar data |
| OpenWeather | Global conditions (needs free API key) |
| US Census | Population, demographics, housing stats |
| NASA | Astronomy photos, Mars rover images, media library |
| World Bank | GDP, poverty indicators, country comparisons |
| Data.gov | Search 300,000+ US government datasets |
| EU Open Data | European Union dataset portal |
Most of these don't even require API keys. NOAA, Census, World Bank, Data.gov, and EU Open Data are all free and open. NASA works without a key but gives you higher rate limits with one. Only OpenWeather requires registration for global weather data.
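To give a sense of what the wrapping looks like, here's a minimal sketch of a single tool, assuming the FastMCP helper from the official MCP Python SDK. This is illustrative rather than the actual mcp-civic-data source, but the shape is the same: one decorated function per tool, with the docstring serving as the description Claude sees.

```python
import os
import requests
from mcp.server.fastmcp import FastMCP  # official MCP Python SDK

mcp = FastMCP("civic-data")

@mcp.tool()
def get_astronomy_photo() -> str:
    """Fetch NASA's Astronomy Picture of the Day: date, title, and image URL."""
    # DEMO_KEY works without registration, just with tighter rate limits.
    key = os.environ.get("NASA_API_KEY", "DEMO_KEY")
    apod = requests.get("https://api.nasa.gov/planetary/apod",
                        params={"api_key": key}, timeout=10).json()
    return f"{apod['date']}: {apod['title']}\n{apod['url']}"

if __name__ == "__main__":
    mcp.run()  # stdio transport, which is what the Claude config above points at
```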
How It Actually Works
The beauty of wrapping these in an MCP is that I don't have to remember anything. I just ask for what I need.

"What's the weather forecast for Austin?" Claude calls get_weather_forecast.

"Show me demographics for Harris County, Texas." Claude calls get_demographics.

"Find Mars rover photos from last week." Claude calls get_mars_rover_photos.

"Compare GDP growth between the US, China, and Germany." Claude calls compare_countries.

No more hunting through old projects. No more rewriting the same parsing logic. No more forgetting which endpoint needs which headers.
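Under the hood, each of those calls is the same integration code I used to rewrite by hand. As an example, the GDP comparison boils down to one World Bank request plus the nested-JSON parsing I mentioned earlier. The function below is an illustrative stand-in, not the package's internals; NY.GDP.MKTP.KD.ZG is the World Bank's GDP growth indicator.

```python
import requests

def compare_gdp_growth(iso3_codes: list[str], years: str = "2019:2023") -> dict:
    """Annual GDP growth (%) for several countries, keyed by country then year."""
    url = ("https://api.worldbank.org/v2/country/"
           + ";".join(iso3_codes)
           + "/indicator/NY.GDP.MKTP.KD.ZG")
    resp = requests.get(url, params={"format": "json", "per_page": 500, "date": years},
                        timeout=10)
    resp.raise_for_status()
    _meta, rows = resp.json()  # the API returns [paging metadata, data rows]
    growth: dict[str, dict[str, float]] = {}
    for row in rows:
        if row["value"] is not None:
            growth.setdefault(row["country"]["value"], {})[row["date"]] = row["value"]
    return growth

if __name__ == "__main__":
    print(compare_gdp_growth(["USA", "CHN", "DEU"]))
```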
The Tools
Here's everything exposed through the MCP:
Weather (5 tools)
- get_weather_forecast - for 7-day US forecasts
- get_weather_alerts - for state-specific warnings
- get_global_weather - for worldwide current conditions
- Plus raw access to NOAA and OpenWeather APIs
Census (4 tools)
- get_population - for state and county numbers
- get_demographics - for detailed breakdowns
- get_housing_stats - for property values and rent data
- Plus raw Census API access for custom queries (the raw call these wrap is sketched below)
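Since the Census query syntax is the least self-explanatory of the bunch, here's roughly the raw call these wrap. The variable code and FIPS values are my example, not code lifted from the package: B01003_001E is the ACS total-population estimate, and 48/201 is Harris County, Texas.

```python
import requests

def county_population(state_fips: str, county_fips: str, year: int = 2022) -> tuple[str, int]:
    """Look up a county's ACS 5-year total population estimate."""
    resp = requests.get(
        f"https://api.census.gov/data/{year}/acs/acs5",
        params={"get": "NAME,B01003_001E",
                "for": f"county:{county_fips}",
                "in": f"state:{state_fips}"},
        timeout=10,
    )
    resp.raise_for_status()
    header, row = resp.json()  # the API returns positional lists, not keyed objects
    record = dict(zip(header, row))
    return record["NAME"], int(record["B01003_001E"])

if __name__ == "__main__":
    print(county_population("48", "201"))  # Harris County, Texas
```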
NASA (4 tools)
- get_astronomy_photo - for the daily space image
- get_mars_rover_photos - from active missions
- search_nasa_images - across their media library
- Plus raw NASA API access
World Bank (3 tools)
- get_country_indicators - for economic metrics
- compare_countries - for side-by-side analysis
- Plus raw API access
Data Portals (6 tools)
- Search and retrieve from both Data.gov and EU Open Data
- Access to over 300,000 government datasets (a quick search sketch follows below)
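Data.gov runs CKAN, so searching it is a single HTTP call against the standard package_search action; the EU portal exposes its own search endpoint, which I won't sketch here. The function name below is illustrative, not the tool's actual implementation.

```python
import requests

def search_datagov(query: str, rows: int = 5) -> list[str]:
    """Search Data.gov's CKAN catalog and return matching dataset titles."""
    resp = requests.get(
        "https://catalog.data.gov/api/3/action/package_search",
        params={"q": query, "rows": rows},
        timeout=10,
    )
    resp.raise_for_status()
    result = resp.json()["result"]
    print(f"{result['count']} datasets match {query!r}")
    return [dataset["title"] for dataset in result["results"]]

if __name__ == "__main__":
    for title in search_datagov("air quality"):
        print("-", title)
```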
Why This Approach Works
The real value isn't just having the APIs available. It's having them available in a way that matches how I actually work now.
When I'm building something and need weather data, I don't want to stop and figure out API authentication. I want to ask for weather data and get weather data. When I need to look up population statistics, I don't want to remember Census API query syntax. I want to ask a question and get an answer.
Wrapping all of this in an MCP means the integration work I did years ago keeps paying dividends. The code is cleaned up, tested, and accessible through a consistent interface. I'll never have to write another NOAA grid coordinate lookup again.
The Bigger Picture
This is the pattern I keep coming back to. Every time I solve a problem, I ask myself: will I need this again? If the answer is yes, it goes into an MCP or a skill or a plugin. Something reusable. Something I can access without thinking about it.
The civic data MCP is just the most recent example. I've done the same thing with SSH connections, Proxmox management, vector databases. Every tool I build becomes part of a growing toolkit that makes the next project easier.
If you've got a folder full of old API integrations gathering dust, maybe it's time to wrap them up too.