Most content publishers overpay for data they could get free from government APIs. For local homeowner content specifically, the US government publishes authoritative housing, climate, labor, and finance data that no commercial provider can match on price or coverage. Zillow charges for their data. ATTOM charges for their data. BrightMLS charges for their data. The Census Bureau, NOAA, BLS, FHFA, USDA, and HUD charge nothing - and their data is primary-source, not aggregated or estimated from third-party feeds.

This guide covers all six sources in depth: the exact API endpoints, authentication requirements, rate limits, which datasets matter for homeowner content, and the specific use cases that turn raw numbers into rankable local pages. If you want to understand how to wire these sources together into a full content pipeline, see our guide to building a local SEO site using free government data. For managing data freshness across a large page inventory, read content freshness for programmatic SEO at scale.

1. US Census Bureau ACS API

Base URL: https://api.census.gov/data/2022/acs/acs5
Authentication: Free API key from census.gov/developers
Rate limits: 500 requests/day without a key, 50 requests/minute with a key

The American Community Survey is the single most valuable source for local homeowner content. It covers every ZIP code, county, and metro area in the country with housing data that no private provider can replicate at this level of geographic granularity. The key tables for homeowner content are:

  • B25034 - Year structure built. Gives you the distribution of housing stock age by decade, which drives content about renovation trends, electrical panel risks, and insulation quality.
  • B25040 - House heating fuel type. Whether homes use natural gas, electric, fuel oil, or wood heat - drives utility cost estimates and HVAC content.
  • B25003 - Tenure (owner vs. renter). The percentage of owner-occupied homes affects how you frame content and which search intents matter in that market.
  • B25077 - Median home value. The foundation of any housing market page.
  • B19013 - Median household income. Enables affordability calculations: what percentage of income goes to housing in this city.

A typical query for ZIP-level data looks like this:

https://api.census.gov/data/2022/acs/acs5?get=B25077_001E,B19013_001E,B25003_002E,B25003_003E&for=zip%20code%20tabulation%20area:78701&key=YOUR_KEY

The critical tip most developers miss: always use 5-year estimates (acs5) for ZIP-level data. The 1-year estimates (acs1) have too many suppressed values for small geographies - you will get blanks for any ZIP with fewer than 65,000 residents. The 5-year estimates pool five years of survey data, which gives you reliable numbers even for small ZIPs. The trade-off is that the data is slightly older, but for structural housing characteristics that change slowly, this is a reasonable trade.
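The query above can be sketched in Python. The Census API returns JSON as a list of rows with the header row first; the variables and ZCTA mirror the example query, and YOUR_KEY is a placeholder:

```python
# Build a ZIP-level (ZCTA) ACS 5-year query URL and parse the response.
# The Census API returns a list of lists: header row first, then data rows.
from urllib.parse import urlencode, quote

BASE = "https://api.census.gov/data/2022/acs/acs5"

def build_acs_url(variables, zcta, key="YOUR_KEY"):
    """Construct the ACS 5-year query URL for one ZIP code tabulation area."""
    params = {
        "get": ",".join(variables),
        "for": f"zip code tabulation area:{zcta}",
        "key": key,
    }
    # safe=":" keeps the geography colon literal; quote encodes spaces as %20
    return f"{BASE}?{urlencode(params, safe=':', quote_via=quote)}"

def parse_acs_rows(payload):
    """Convert the list-of-lists payload into dicts keyed by variable name."""
    header, *rows = payload
    return [dict(zip(header, row)) for row in rows]

url = build_acs_url(["B25077_001E", "B19013_001E"], "78701")

# A payload in the shape the API returns (values here are illustrative):
sample = [
    ["B25077_001E", "B19013_001E", "zip code tabulation area"],
    ["425000", "74000", "78701"],
]
records = parse_acs_rows(sample)
print(records[0]["B25077_001E"])  # "425000"
```

In a real pipeline you would fetch `url` with any HTTP client and pass the decoded JSON to `parse_acs_rows`.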

The use case is enriching city guide pages with specific facts: "58% of homes in this ZIP are owner-occupied," "62% of homes were built before 1980 - which means most have original electrical panels that may not support modern loads." These are the kinds of statements that differentiate a data-backed page from a generic city guide.

2. NOAA Climate Data Online API

Base URL: https://www.ncei.noaa.gov/cdo-web/api/v2
Authentication: Free token from ncei.noaa.gov/cdo-web/token
Rate limits: 1,000 records per request, 10,000 requests per day

The National Centers for Environmental Information publishes 30-year climate normals covering roughly 9,800 weather stations across the US. For homeowner content, the most useful dataset is NORMAL_MLY - monthly climate normals for the 1991-2020 baseline period. The variables that drive the most content value are:

  • MLY-PRCP-NORMAL - Average monthly precipitation. Drives gutter cleaning schedules and drainage content.
  • MLY-TMAX-NORMAL and MLY-TMIN-NORMAL - Average monthly high and low temperatures. The foundation for any seasonal maintenance calendar.
  • MLY-SNOW-NORMAL - Average monthly snowfall. Drives roof load, snow removal, and ice dam content.
  • DT32 - Average number of days per month with a minimum temperature at or below 32F. This is the pipe freeze risk signal.
  • DX90 - Average days per month with a high of 90F or above. Drives HVAC maintenance and cooling cost content.
  • ANN-HTDD-NORMAL - Annual heating degree days. Enables home heating cost estimates when combined with fuel type from Census B25040.

The content this enables is specific and genuinely useful. Instead of "winterize your pipes before it gets cold," you can write: "Austin typically experiences 3 days below freezing in January - the freeze risk window runs from late December through mid-February. Pipe insulation in exposed crawl spaces and attic runs is worth doing by December 15." That level of specificity comes directly from the DT32 normals for Austin-area weather stations.

One important caveat: the 30-year normals use the 1991-2020 baseline. This is the current official baseline period and will not be updated until the 2031 release (covering 2001-2030). For most structural content the normals are accurate enough, but for pages that reference recent extreme weather patterns, note that climate has shifted meaningfully in some regions since the baseline period.

Finding the right station for a given city requires a two-step lookup: first find nearby stations using the /stations endpoint filtered by datasetid=NORMAL_MLY and a bounding box, then pull the normals for the closest station by distance to your target city center.
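The second step of that lookup can be sketched locally. The station records below mimic the shape of the CDO `/stations` response (`id`, `latitude`, `longitude`); the IDs and coordinates are illustrative, the bounding-box URL is an assumption you should check against the CDO docs, and the actual fetch (with your token in a `token` header) is omitted:

```python
# Pick the closest NORMAL_MLY station to a city center using great-circle
# (haversine) distance. Step one would fetch candidate stations from a URL
# like STATIONS_URL; step two is the selection implemented here.
import math

STATIONS_URL = ("https://www.ncei.noaa.gov/cdo-web/api/v2/stations"
                "?datasetid=NORMAL_MLY&extent=30.1,-98.0,30.5,-97.5")

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest_station(stations, city_lat, city_lon):
    """Return the station record closest to the target city center."""
    return min(stations, key=lambda s: haversine_miles(
        city_lat, city_lon, s["latitude"], s["longitude"]))

# Illustrative station records (coordinates roughly Austin and San Antonio):
stations = [
    {"id": "GHCND:AUS_EXAMPLE", "latitude": 30.32, "longitude": -97.77},
    {"id": "GHCND:SAT_EXAMPLE", "latitude": 29.53, "longitude": -98.46},
]
best = nearest_station(stations, 30.27, -97.74)  # Austin city center
print(best["id"])  # GHCND:AUS_EXAMPLE
```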

3. BLS Occupational Employment and Wage Statistics API

Base URL: https://api.bls.gov/publicAPI/v2/timeseries/data
Authentication: Optional registration key (strongly recommended)
Rate limits: 25 series per request without key, 50 series per request with key

The Bureau of Labor Statistics publishes Occupational Employment and Wage Statistics (OEWS) annually, covering wages for every occupation in every metropolitan statistical area. For homeowner contractor cost content, this is the data that makes your cost guides actually defensible. Instead of "a plumber in Denver costs $150-$300 per hour depending on the job," you can say "the median plumbing wage in Denver is $28.40/hour, versus a $22.10 national median - so labor costs in this market run roughly 28% above the national level."

The relevant SOC occupation codes for homeowner content are:

  • 47-2152 - Plumbers, Pipefitters, and Steamfitters
  • 47-2111 - Electricians
  • 47-2181 - Roofers
  • 49-9021 - Heating, Air Conditioning, and Refrigeration Mechanics and Installers
  • 47-2031 - Carpenters
  • 47-2141 - Painters and Paperhangers

BLS series IDs are constructed by combining a prefix, area code, industry code, occupation code, and data type code. For OEWS metro-level data, the format is OEUM{area_code}{industry_code}{occ_code}{data_type}, where the industry code is 000000 for the cross-industry total and data type 08 is median hourly wage (03 is the mean). The area code for Denver-Aurora-Lakewood CO is 0019740, so the series ID for median plumber wages in Denver is OEUM001974000000047215208.

The OEWS data releases annually in May. Between releases, treat the data as valid - construction wages do not shift dramatically month to month. Cache it for the full year and set a re-fetch trigger for the first week of June each year to pick up the new release.

4. FHFA House Price Index

Access method: CSV file downloads (no live API)
Data URL: https://www.fhfa.gov/DataTools/Downloads/Pages/House-Price-Index-Datasets.aspx
Authentication: None required

The Federal Housing Finance Agency does not offer a live REST API - the HPI data is published as downloadable CSV files updated quarterly. This means you need to build your own data pipeline: download the CSV on a quarterly schedule, parse it, and store it in a local database that your page generation code queries.

The two most useful files are:

  • HPI_AT_metro.csv - All-transactions house price index by metropolitan statistical area, quarterly going back to 1975 for most metros. This is your primary source for "home prices in the Austin-Round Rock MSA have increased X% over the last 5 years."
  • HPI_master.csv - The master file containing all geographies including the ZIP-level expanded dataset, though ZIP-level coverage is less complete than metro-level.

Converting the raw HPI index to percentage change requires simple math: divide the current index value by the value from N quarters ago and subtract 1. The index is not a price - it is a relative measure anchored to a base period. A metro with an index of 420 in Q4 2025 versus 350 in Q4 2020 has appreciated (420/350 - 1) = 20% over 5 years.
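That arithmetic is worth encoding once and reusing. A minimal sketch, using the example figures from the text (quarterly data, so a 5-year lookback is 20 quarters):

```python
# Convert FHFA HPI index values into an appreciation percentage.
def hpi_change_pct(current, past):
    """Percent change between two index observations."""
    return (current / past - 1) * 100

def trailing_change(series, quarters_back=20):
    """series: index values in chronological quarterly order.
    20 quarters back = 5 years for the standard appreciation stat."""
    return hpi_change_pct(series[-1], series[-1 - quarters_back])

# Example from the text: index 420 in Q4 2025 vs 350 in Q4 2020.
print(round(hpi_change_pct(420, 350)))  # 20
```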

For content use, combine FHFA HPI with Census B25077 (median home value) to produce statements like: "The Austin-Round Rock MSA median home value is approximately $425,000 (2022 ACS), and per FHFA data, values in this metro have appreciated 22% over the past 5 years - the 7th fastest appreciation rate among the top 50 metros." Neither data point alone tells the full story. Together they give readers a real market picture.

5. USDA Plant Hardiness Zone Lookup

Access method: ZIP code lookup CSV download
Data URL: https://planthardiness.ars.usda.gov
Authentication: None required

The USDA does not maintain a rate-limited live API for plant hardiness zones. The best approach is to download the ZIP-to-zone lookup CSV from the USDA Agricultural Research Service, store it locally as a lookup table in your database, and query it directly. The 2023 updated map (the most recent revision) uses higher-resolution temperature data than the previous 2012 version and reflects warming trends - about half of the US shifted to a warmer half-zone.

The hardiness zone system has 13 numbered zones, each divided into an "a" (colder) and "b" (warmer) half-zone, running from 1a (average annual minimum of -60 to -55F) to 13b (65 to 70F). For content, the practical use is driving gardening and landscaping sections of maintenance calendars:

  • Zone 5b (Chicago area, -15 to -10F minimum): "Protect tree roses, fig trees, and any borderline-hardy perennials before Thanksgiving. Freeze risk starts in late October."
  • Zone 8b (Austin/Dallas area, 15 to 20F minimum): "Most tropical-looking plants survive winters here, but protect bananas, citrus, and bougainvillea when temperatures drop below 28F. Irrigation systems should be winterized when overnight lows hit 35F."
  • Zone 10a (Miami/Southern California, 30 to 35F minimum): "Frost events are rare but real - cover citrus and tropical foliage on nights forecast below 35F. Irrigation never needs winterizing."

This content segment is highly searchable ("when to plant [plant] in [city]", "gardening zone [zip code]") and impossible to produce accurately without the USDA zone data as a foundation. The lookup table is roughly 30,000 rows - well within what any SQL database handles trivially.
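The lookup-table approach described above is a few lines of SQL. A sketch using SQLite - the column names ("zipcode", "zone") and the sample rows are assumptions; match them to the actual CSV header when you download it:

```python
# Load the ZIP-to-zone lookup into a local table and query it directly.
import sqlite3

db = sqlite3.connect(":memory:")  # use a file path in a real pipeline
db.execute("CREATE TABLE hardiness (zipcode TEXT PRIMARY KEY, zone TEXT)")
db.executemany("INSERT INTO hardiness VALUES (?, ?)", [
    ("60601", "5b"), ("78701", "8b"), ("33101", "10a"),  # illustrative rows
])

def zone_for_zip(conn, zipcode):
    """Return the hardiness half-zone for a ZIP, or None if not in the table."""
    row = conn.execute(
        "SELECT zone FROM hardiness WHERE zipcode = ?", (zipcode,)).fetchone()
    return row[0] if row else None

print(zone_for_zip(db, "78701"))  # 8b
```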

6. HUD Fair Market Rents API

Base URL: https://www.huduser.gov/hudapi/public/fmr
Authentication: Free token from huduser.gov/portal/dataset/fmr-api.html
Data: Fair market rents by metro area and county for studio through 4BR

The Department of Housing and Urban Development publishes Fair Market Rents (FMR) annually for every metropolitan and non-metropolitan area in the country. FMRs are HUD's estimate of the 40th percentile gross rent - the rent below which 40% of standard-quality market-rate rentals fall. While that is not the same as median rent, it is an officially published and annually updated benchmark that gives local rent context for any city.

The API is straightforward. A state-level query for Texas - which returns FMR records for every metro area and county in the state, including Travis County (FIPS 48453) - looks like:

GET https://www.huduser.gov/hudapi/public/fmr/statedata/TX
Authorization: Bearer YOUR_TOKEN

For homeowner content, FMR data drives rent-vs-buy analysis sections. If the HUD FMR for a 3-bedroom in Austin is $1,842/month, that is $22,104/year in rent. A 30-year mortgage on a $425,000 home (20% down, 6.5% rate) runs approximately $2,150/month - $308 more than renting. But that mortgage payment builds equity, the rent payment does not, and the comparison looks different in 5 years. This is the kind of analysis that answers real questions homeowners are searching for, and it is only possible because HUD publishes FMR data free.

FMR data updates annually in October for the following fiscal year. Set your cache TTL to 365 days and re-fetch in the first week of October each year.

Combining Multiple APIs for Richer Pages

The real competitive advantage of government data is not any single source - it is what happens when you combine them. Individual APIs give you data points. Combining them gives you analysis that no single-source competitor can replicate.

Consider what a well-built "Austin TX housing market" section can say when all six sources are integrated:

"The Austin-Round Rock MSA median home value is approximately $425,000 (Census B25077, 2022 ACS). Per FHFA, values in this metro have risen 22% over the past 5 years - above the national average of 18%. With median household income at $74,000 (Census B19013), Austin homeowners spend roughly 32% of gross income on housing costs - above the 28% threshold financial planners typically recommend. For context, the HUD Fair Market Rent for a 3BR in this metro is $1,842/month. Local plumbing labor runs approximately $28/hour (BLS OEWS), about 27% above the national median, which pushes renovation costs higher than in most peer cities."

Every sentence in that paragraph comes from a different government source. None of it can be produced without API integration. All of it is verifiable, authoritative, and updated on a predictable schedule. That is the content quality that earns links and holds rankings.

Here is the cross-source data flow for a standard city guide page:

  • Housing stock overview - primary: Census B25034 (year built); supporting: Census B25040 (heating fuel); output: age distribution + dominant fuel type
  • Market value context - primary: Census B25077 (median value); supporting: FHFA HPI (5-year change); output: current value + appreciation trend
  • Affordability analysis - primary: Census B19013 (income); supporting: HUD FMR (rent benchmark); output: housing cost-to-income ratio + rent vs buy
  • Contractor cost context - primary: BLS OEWS (trade wages); output: local vs national wage premium
  • Maintenance calendar - primary: NOAA NORMAL_MLY; supporting: USDA hardiness zone; output: month-by-month task schedule

Rate Limit Management Strategy

Running all six sources across 1,000 cities requires careful rate limit planning. Here is the realistic call volume for a first-time full data pull:

  • Census ACS: Roughly 35 API calls per city (one per table per geography level, batched where possible). For 1,000 cities: approximately 35,000 calls. With the free API key (50/min), this takes about 12 hours of continuous operation. Spread over 2 days to stay well under limits.
  • NOAA: One station lookup + one normals pull per city. For 1,000 cities: approximately 5,000 calls. Well within the 10,000/day limit - completable in a single day.
  • BLS OEWS: One call per metro area (not per city - many cities share an MSA). For 1,000 cities spanning roughly 200 unique MSAs: approximately 200 calls. Trivial.
  • FHFA: Single CSV download, parsed into a local database. Zero ongoing API calls.
  • USDA: Single CSV download, stored as a lookup table. Zero ongoing API calls.
  • HUD FMR: One call per county. For 1,000 cities across roughly 400 unique counties: approximately 400 calls. Complete in under an hour.

Total meaningful API call volume for a fresh 1,000-city run is under 41,000 calls spread across the sources with the highest per-city volume. Manageable over 2-3 days with appropriate per-source throttling built into your pipeline.
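The per-source throttling mentioned above can be as simple as spacing calls to a requests-per-minute ceiling. A minimal sketch (the `fetch_acs` call in the comment is a hypothetical function, not a real API):

```python
# A minimal per-source throttle: each .wait() sleeps just long enough to
# keep calls at or under the configured rate (e.g. 50/min for Census).
import time

class Throttle:
    def __init__(self, per_minute):
        self.interval = 60.0 / per_minute  # minimum seconds between calls
        self.last = 0.0

    def wait(self):
        """Block until enough time has passed since the previous call."""
        now = time.monotonic()
        delay = self.interval - (now - self.last)
        if delay > 0:
            time.sleep(delay)
        self.last = time.monotonic()

census = Throttle(per_minute=50)
# for city in cities:
#     census.wait()
#     fetch_acs(city)   # hypothetical per-city fetch
```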

Cache aggressively after the initial pull. Census data does not change more than once a year. NOAA normals change every decade. BLS wages update in May. HUD FMR updates in October. Build TTLs into your data store and schedule re-fetches around the actual release calendars rather than on arbitrary intervals.
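One way to encode "re-fetch around the release calendar": map each source to its release month and compute the next scheduled pull. The release months below come from the text; the day-of-month and the `census_acs` December entry are assumptions to adjust for your own pipeline:

```python
# Schedule re-fetches around known release months rather than fixed TTLs.
from datetime import date

RELEASE_MONTH = {
    "bls_oews": 6,     # May release, re-fetch first week of June
    "hud_fmr": 10,     # October release
    "census_acs": 12,  # ACS 5-year typically releases in December (assumption)
}

def next_refetch(source, today):
    """Next scheduled re-fetch: the 7th of the source's release month."""
    month = RELEASE_MONTH[source]
    year = today.year if (today.month, today.day) < (month, 7) else today.year + 1
    return date(year, month, 7)

print(next_refetch("hud_fmr", date(2025, 3, 1)))  # 2025-10-07
```

FHFA's quarterly HPI does not fit a single release month; for that source, check each quarter instead.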

What to Do When Data Is Missing

Government data has gaps. Census ACS suppresses values for small geographies where the sample size is too small to produce reliable estimates - you will see this often for ZIPs with populations under 2,000. NOAA coverage is thinner in rural areas where weather stations are sparse. BLS OEWS does not report wages for occupations with fewer than 10 establishments in an MSA.

The right fallback strategy, in order of preference:

  1. County-level data - most Census tables available at ZIP level are also available at county level with far fewer suppressions. If a ZIP lacks B25077, use the county median home value with a note: "County-level data used for this estimate."
  2. State average - when county data is also unavailable or unreliable. "Based on state averages, homes in this area..." is honest and still useful.
  3. MSA-level fallback - for BLS OEWS gaps, use the nearest reported MSA or the state-level OEWS estimate.
  4. Explicit disclosure - if the gap is large, say so: "Detailed housing data is limited for this area due to small sample sizes. We show the county average, which may not reflect this specific neighborhood."

Never leave a section blank or generate fake precision from missing data. A page that says "county-level data used" is more credible than one that quietly omits the section or fills it with national averages presented as local facts. Readers and Google both notice the difference.
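The fallback chain above reduces to a small resolver that records which geography level supplied the value, so the page can disclose it. A sketch, assuming the per-level values have already been fetched (with `None` marking a suppressed value):

```python
# Resolve a metric through the ZIP -> county -> state fallback chain,
# returning both the value and the level that supplied it.
def resolve_with_fallback(values):
    """values: dict mapping geography level to a value, or None if suppressed."""
    for level in ("zip", "county", "state"):
        if values.get(level) is not None:
            return values[level], level
    return None, "unavailable"

# Illustrative: ZIP-level median home value suppressed, county available.
value, level = resolve_with_fallback(
    {"zip": None, "county": 310_000, "state": 285_000})
print(value, level)  # 310000 county
```

When `level` is anything other than "zip", the page template can attach the disclosure note ("County-level data used for this estimate").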

Homeowner.wiki integrates all 6 of these sources and handles rate limits, caching, suppression fallbacks, and data freshness automatically. You get clean, normalized data fields - no API juggling required.

Stop Managing 6 Different API Integrations

Census authentication, NOAA station lookups, BLS series ID construction, FHFA CSV parsing, USDA lookup tables, HUD tokens - Homeowner.wiki handles all of it. Join the waitlist and get access to clean, ready-to-use government data for every city in the US.

Join the Waitlist