A Python Tool for Fetching Air Pollution Data from Google Maps Air Quality APIs

Learn how to fetch rich, real-time air quality data from all over the world

This article details how we can use the Google Maps Air Quality APIs in Python to fetch and explore live air pollution data, time series and maps. Check out the full code here.

1. Background

In August 2023, Google announced the addition of an air quality service to its list of mapping APIs. You can read more about that here. It appears this information is now also available from within the Google Maps app, though the data obtainable via the APIs turned out to be much richer.

According to the announcement, Google is combining information from many sources at different resolutions — ground-based pollution sensors, satellite data, live traffic information and predictions from numerical models — to produce a dynamically updated dataset of air quality in 100 countries at up to 500m resolution. This sounds like a very interesting and potentially useful dataset for all sorts of mapping, healthcare and planning applications!

When first reading about this I was planning to try it out in a “talk to your data” application, using some of the things learned from building this travel mapper tool. Maybe a system that can plot a time series of air pollution concentrations in your favorite city, or perhaps a tool to help people plan hikes in their local area so as to avoid bad air?

There are three API tools that can help here — a “current conditions” service, which provides current air quality index values and pollutant concentrations at a given location; a “historical conditions” service, which does the same but at hourly intervals for up to 30 days in the past; and a “heatmap” service, which provides current conditions over a given area as an image.

Previously, I had used the excellent googlemaps package to call Google Maps APIs in Python, but these new APIs are not yet supported. Surprisingly, beyond the official documentation I could find few examples of people using these new tools and no pre-existing Python packages designed to call them. I would be happily corrected though if someone knows otherwise!

I therefore built some quick tools of my own, and in this post we walk through how they work and how to use them. I hope this will be useful to anyone wanting to experiment with these new APIs in Python and looking for a place to start. All the code for this project can be found here, and I’ll likely be expanding this repo over time as I add more functionality and build some sort of mapping application with the air quality data.

2. Get the current air quality at a given location

Let’s get started! In this section we’ll go over how to fetch air quality data at a given location with Google Maps. You’ll first need an API key, which you can generate via your Google Cloud account. They have a 90-day free trial period, after which you’ll pay for API services you use. Make sure you enable the “Air Quality API”, and be aware of the pricing policies before you start making a lot of calls!

Screenshot of the Google Cloud API library, where you can activate the air quality API. Image generated by the author.

I usually store my API key in a .env file and load it with dotenv using a function like this:

import os
from pathlib import Path

from dotenv import load_dotenv

def load_secrets():
    load_dotenv()
    env_path = Path(".") / ".env"
    load_dotenv(dotenv_path=env_path)

    google_maps_key = os.getenv("GOOGLE_MAPS_API_KEY")

    return {
        "GOOGLE_MAPS_API_KEY": google_maps_key,
    }

Getting current conditions requires a POST request as detailed here. We’re going to take inspiration from the googlemaps package to do this in a way that can be generalized. First, we build a client class that uses requests to make the call. The goal is quite straightforward — we want to build a URL like the one below, and include all the request options specific to the user’s query.

https://airquality.googleapis.com/v1/currentConditions:lookup?key=YOUR_API_KEY

The Client class takes in our API key as key and then builds the request_url for the query. It accepts request options as a params dictionary and then puts them in the JSON request body, which is handled by the self.session.post() call.

import requests
import io

class Client(object):
    DEFAULT_BASE_URL = "https://airquality.googleapis.com"

    def __init__(self, key):
        self.session = requests.Session()
        self.key = key

    def request_post(self, url, params):
        request_url = self.compose_url(url)
        request_header = self.compose_header()
        request_body = params

        response = self.session.post(
            request_url,
            headers=request_header,
            json=request_body,
        )

        return self.get_body(response)

    def compose_url(self, path):
        return self.DEFAULT_BASE_URL + path + "?" + "key=" + self.key

    @staticmethod
    def get_body(response):
        body = response.json()

        if "error" in body:
            return body["error"]

        return body

    @staticmethod
    def compose_header():
        return {
            "Content-Type": "application/json",
        }

Now we can make a function that helps the user assemble valid request options for the current conditions API and then uses this Client class to make the request. Again, this is inspired by the design of the googlemaps package.

def current_conditions(
    client,
    location,
    include_local_AQI=True,
    include_health_suggestion=False,
    include_all_pollutants=True,
    include_additional_pollutant_info=False,
    include_dominant_pollutant_conc=True,
    language=None,
):
    """
    See documentation for this API here
    https://developers.google.com/maps/documentation/air-quality/reference/rest/v1/currentConditions/lookup
    """
    params = {}

    if isinstance(location, dict):
        params["location"] = location
    else:
        raise ValueError(
            "Location argument must be a dictionary containing latitude and longitude"
        )

    extra_computations = []
    if include_local_AQI:
        extra_computations.append("LOCAL_AQI")

    if include_health_suggestion:
        extra_computations.append("HEALTH_RECOMMENDATIONS")

    if include_additional_pollutant_info:
        extra_computations.append("POLLUTANT_ADDITIONAL_INFO")

    if include_all_pollutants:
        extra_computations.append("POLLUTANT_CONCENTRATION")

    if include_dominant_pollutant_conc:
        extra_computations.append("DOMINANT_POLLUTANT_CONCENTRATION")

    if language:
        params["language"] = language

    params["extraComputations"] = extra_computations

    return client.request_post("/v1/currentConditions:lookup", params)

The options for this API are relatively straightforward. It needs a dictionary with the longitude and latitude of the point you want to investigate, and can optionally take various other arguments that control how much information is returned. Let's see it in action with all the information options enabled:

# set up client
client = Client(key=GOOGLE_MAPS_API_KEY)
# a location in Los Angeles, CA
location = {"longitude": -118.3, "latitude": 34.1}
# a JSON response
current_conditions_data = current_conditions(
    client,
    location,
    include_health_suggestion=True,
    include_additional_pollutant_info=True,
)

A lot of interesting information is returned! Not only do we have the air quality index values from the Universal and US-based AQI indices, but we also have concentrations of the major pollutants, a description of each one and an overall set of health recommendations for the current air quality.

{'dateTime': '2023-10-12T05:00:00Z',
 'regionCode': 'us',
 'indexes': [{'code': 'uaqi',
   'displayName': 'Universal AQI',
   'aqi': 60,
   'aqiDisplay': '60',
   'color': {'red': 0.75686276, 'green': 0.90588236, 'blue': 0.09803922},
   'category': 'Good air quality',
   'dominantPollutant': 'pm10'},
  {'code': 'usa_epa',
   'displayName': 'AQI (US)',
   'aqi': 39,
   'aqiDisplay': '39',
   'color': {'green': 0.89411765},
   'category': 'Good air quality',
   'dominantPollutant': 'pm10'}],
 'pollutants': [{'code': 'co',
   'displayName': 'CO',
   'fullName': 'Carbon monoxide',
   'concentration': {'value': 292.61, 'units': 'PARTS_PER_BILLION'},
   'additionalInfo': {'sources': 'Typically originates from incomplete combustion of carbon fuels, such as that which occurs in car engines and power plants.',
    'effects': 'When inhaled, carbon monoxide can prevent the blood from carrying oxygen. Exposure may cause dizziness, nausea and headaches. Exposure to extreme concentrations can lead to loss of consciousness.'}},
  {'code': 'no2',
   'displayName': 'NO2',
   'fullName': 'Nitrogen dioxide',
   'concentration': {'value': 22.3, 'units': 'PARTS_PER_BILLION'},
   'additionalInfo': {'sources': 'Main sources are fuel burning processes, such as those used in industry and transportation.',
    'effects': 'Exposure may cause increased bronchial reactivity in patients with asthma, lung function decline in patients with Chronic Obstructive Pulmonary Disease (COPD), and increased risk of respiratory infections, especially in young children.'}},
  {'code': 'o3',
   'displayName': 'O3',
   'fullName': 'Ozone',
   'concentration': {'value': 24.17, 'units': 'PARTS_PER_BILLION'},
   'additionalInfo': {'sources': 'Ozone is created in a chemical reaction between atmospheric oxygen, nitrogen oxides, carbon monoxide and organic compounds, in the presence of sunlight.',
    'effects': 'Ozone can irritate the airways and cause coughing, a burning sensation, wheezing and shortness of breath. Additionally, ozone is one of the major components of photochemical smog.'}},
  {'code': 'pm10',
   'displayName': 'PM10',
   'fullName': 'Inhalable particulate matter (<10µm)',
   'concentration': {'value': 44.48, 'units': 'MICROGRAMS_PER_CUBIC_METER'},
   'additionalInfo': {'sources': 'Main sources are combustion processes (e.g. indoor heating, wildfires), mechanical processes (e.g. construction, mineral dust, agriculture) and biological particles (e.g. pollen, bacteria, mold).',
    'effects': 'Inhalable particles can penetrate into the lungs. Short term exposure can cause irritation of the airways, coughing, and aggravation of heart and lung diseases, expressed as difficulty breathing, heart attacks and even premature death.'}},
  {'code': 'pm25',
   'displayName': 'PM2.5',
   'fullName': 'Fine particulate matter (<2.5µm)',
   'concentration': {'value': 11.38, 'units': 'MICROGRAMS_PER_CUBIC_METER'},
   'additionalInfo': {'sources': 'Main sources are combustion processes (e.g. power plants, indoor heating, car exhausts, wildfires), mechanical processes (e.g. construction, mineral dust) and biological particles (e.g. bacteria, viruses).',
    'effects': 'Fine particles can penetrate into the lungs and bloodstream. Short term exposure can cause irritation of the airways, coughing and aggravation of heart and lung diseases, expressed as difficulty breathing, heart attacks and even premature death.'}},
  {'code': 'so2',
   'displayName': 'SO2',
   'fullName': 'Sulfur dioxide',
   'concentration': {'value': 0, 'units': 'PARTS_PER_BILLION'},
   'additionalInfo': {'sources': 'Main sources are burning processes of sulfur-containing fuel in industry, transportation and power plants.',
    'effects': 'Exposure causes irritation of the respiratory tract, coughing and generates local inflammatory reactions. These in turn, may cause aggravation of lung diseases, even with short term exposure.'}}],
 'healthRecommendations': {'generalPopulation': 'With this level of air quality, you have no limitations. Enjoy the outdoors!',
  'elderly': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, open fires and other sources of smoke.',
  'lungDiseasePopulation': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, industrial emission stacks, open fires and other sources of smoke.',
  'heartDiseasePopulation': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, industrial emission stacks, open fires and other sources of smoke.',
  'athletes': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, industrial emission stacks, open fires and other sources of smoke.',
  'pregnantWomen': 'To keep you and your baby healthy, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, open fires and other sources of smoke.',
  'children': 'If you start to feel respiratory discomfort such as coughing or breathing difficulties, consider reducing the intensity of your outdoor activities. Try to limit the time you spend near busy roads, construction sites, open fires and other sources of smoke.'}}
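As a quick illustration, specific values can be pulled straight out of this dictionary. A minimal sketch using the current_conditions_data response above:

# extract the Universal AQI entry from the list of index results
uaqi = next(i for i in current_conditions_data["indexes"] if i["code"] == "uaqi")
print(uaqi["aqi"], uaqi["category"])  # 60 'Good air quality'

# and the PM2.5 concentration from the list of pollutants
pm25 = next(p for p in current_conditions_data["pollutants"] if p["code"] == "pm25")
print(pm25["concentration"]["value"], pm25["concentration"]["units"])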

3. Get a time series of air quality at a given location

Wouldn’t it be nice to be able to fetch a time series of these AQI and pollutant values for a given location? That might reveal interesting patterns, such as correlations between the pollutants or daily fluctuations caused by traffic or weather-related factors.

We can do this with another POST request to the historical conditions API, which will give us an hourly history. This works in much the same way as current conditions; the only major difference is that since the results can be quite long, they are returned as several pages, which requires a little extra logic to handle.

Let’s modify the request_post method of Client to handle this.

def request_post(self, url, params):
    request_url = self.compose_url(url)
    request_header = self.compose_header()
    request_body = params

    response = self.session.post(
        request_url,
        headers=request_header,
        json=request_body,
    )

    response_body = self.get_body(response)

    # put the first page in the response dictionary
    page = 1
    final_response = {
        "page_{}".format(page): response_body
    }
    # fetch all the pages if needed
    while "nextPageToken" in response_body:
        # call again with the next page's token
        request_body.update({
            "pageToken": response_body["nextPageToken"]
        })
        response = self.session.post(
            request_url,
            headers=request_header,
            json=request_body,
        )
        response_body = self.get_body(response)
        page += 1
        final_response["page_{}".format(page)] = response_body

    return final_response

This handles the case where response_body contains a field called nextPageToken, which is the ID of the next page of data that has been generated and is ready to fetch. Where that information exists, we just need to call the API again with a new param called pageToken, which directs it to the relevant page. We do this repeatedly in a while loop until there are no more pages left. Our final_response dictionary therefore now contains another layer denoted by page number. For calls to current_conditions there will only ever be one page, but for calls to historical_conditions there may be several.

With that taken care of, we can write a historical_conditions function in a very similar style to current_conditions.

def historical_conditions(
    client,
    location,
    specific_time=None,
    lag_time=None,
    specific_period=None,
    include_local_AQI=True,
    include_health_suggestion=False,
    include_all_pollutants=True,
    include_additional_pollutant_info=False,
    include_dominant_pollutant_conc=True,
    language=None,
):
    """
    See documentation for this API here
    https://developers.google.com/maps/documentation/air-quality/reference/rest/v1/history/lookup
    """
    params = {}

    if isinstance(location, dict):
        params["location"] = location
    else:
        raise ValueError(
            "Location argument must be a dictionary containing latitude and longitude"
        )

    if isinstance(specific_period, dict) and not specific_time and not lag_time:
        assert "startTime" in specific_period
        assert "endTime" in specific_period

        params["period"] = specific_period

    elif specific_time and not lag_time and not isinstance(specific_period, dict):
        # note that time must be in the "Zulu" format
        # e.g. datetime.datetime.strftime(datetime.datetime.now(), "%Y-%m-%dT%H:%M:%SZ")
        params["dateTime"] = specific_time

    # lag periods in hours
    elif lag_time and not specific_time and not isinstance(specific_period, dict):
        params["hours"] = lag_time

    else:
        raise ValueError(
            "Must provide specific_time, specific_period or lag_time arguments"
        )

    extra_computations = []
    if include_local_AQI:
        extra_computations.append("LOCAL_AQI")

    if include_health_suggestion:
        extra_computations.append("HEALTH_RECOMMENDATIONS")

    if include_additional_pollutant_info:
        extra_computations.append("POLLUTANT_ADDITIONAL_INFO")

    if include_all_pollutants:
        extra_computations.append("POLLUTANT_CONCENTRATION")

    if include_dominant_pollutant_conc:
        extra_computations.append("DOMINANT_POLLUTANT_CONCENTRATION")

    if language:
        params["language"] = language

    params["extraComputations"] = extra_computations
    # page size default set to 100 here
    params["pageSize"] = 100
    # page token will get filled in if needed by the request_post method
    params["pageToken"] = ""

    return client.request_post("/v1/history:lookup", params)

To define the historical period, the API can accept a lag_time in hours, up to 720 (30 days). It can also accept a specific_period dictionary, which defines start and end times in the format described in the comments above. Finally, to fetch a single hour of data, it can accept just one timestamp, provided by specific_time. Also note the use of the pageSize parameter, which controls how many time points are returned in each call to the API. The default here is 100.
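For instance, here is a minimal sketch of requesting a fixed 24-hour window with specific_period, reusing the client and location objects from above (the timestamps are arbitrary illustrative values):

# an illustrative 24-hour window, in the "Zulu" format the API expects
period = {
    "startTime": "2023-10-01T00:00:00Z",
    "endTime": "2023-10-02T00:00:00Z",
}

window_data = historical_conditions(
    client,
    location,
    specific_period=period,
)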

Let’s try it out.

# set up client
client = Client(key=GOOGLE_MAPS_API_KEY)
# a location in Los Angeles, CA
location = {"longitude": -118.3, "latitude": 34.1}
# a JSON response
history_conditions_data = historical_conditions(
    client,
    location,
    lag_time=720,
)

We should get a long, nested JSON response containing the AQI index values and specific pollutant values at one-hour increments over the last 720 hours. There are many ways to format this into a structure that's more amenable to visualization and analysis, and the function below will convert it into a pandas dataframe in “long” format, which works well with seaborn for plotting.

from itertools import chain
import pandas as pd

def historical_conditions_to_df(response_dict):
    # concatenate the "hoursInfo" lists from all pages
    chained_pages = list(
        chain(*[response_dict[p]["hoursInfo"] for p in [*response_dict]])
    )

    all_indexes = []
    all_pollutants = []
    for i in range(len(chained_pages)):
        # need this check in case one of the timestamps is missing data,
        # which can sometimes happen
        if "indexes" in chained_pages[i]:
            this_element = chained_pages[i]
            # fetch the time
            time = this_element["dateTime"]
            # fetch all the index values and add metadata
            all_indexes += [
                (time, x["code"], x["displayName"], "index", x["aqi"], None)
                for x in this_element["indexes"]
            ]
            # fetch all the pollutant values and add metadata
            all_pollutants += [
                (
                    time,
                    x["code"],
                    x["fullName"],
                    "pollutant",
                    x["concentration"]["value"],
                    x["concentration"]["units"],
                )
                for x in this_element["pollutants"]
            ]

    all_results = all_indexes + all_pollutants
    # generate "long format" dataframe
    res = pd.DataFrame(
        all_results, columns=["time", "code", "name", "type", "value", "unit"]
    )
    res["time"] = pd.to_datetime(res["time"])
    return res

Running this on the output of historical_conditions will produce a dataframe whose columns are formatted for easy analysis.

df = historical_conditions_to_df(history_conditions_data)

Example dataframe of historical AQI data, ready for plotting.

And we can now plot the result in seaborn or some other visualization tool.

import seaborn as sns

g = sns.relplot(
    x="time",
    y="value",
    data=df[df["code"].isin(["uaqi", "usa_epa", "pm25", "pm10"])],
    kind="line",
    col="name",
    col_wrap=4,
    hue="type",
    height=4,
    facet_kws={"sharey": False, "sharex": False},
)
g.set_xticklabels(rotation=90)

Universal AQI, US AQI, pm25 and pm10 values for this location in LA over a 30-day period. Image generated by the author.

This is already very interesting! There are clearly several periodicities in the pollutant time series, and it's notable that the US AQI is closely correlated with the pm25 and pm10 concentrations, as expected. I am much less familiar with the Universal AQI that Google is providing here, so can't explain why it appears anti-correlated with pm25 and pm10. Does a smaller UAQI mean better air quality? Despite some searching around I've been unable to find a good answer.
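One quick way to check these correlations is to pivot the long-format dataframe into one column per series. A small sketch using the df produced above:

# pivot so each index/pollutant code becomes a column of hourly values
wide = df.pivot_table(index="time", columns="code", values="value")
# pairwise correlations between the AQI indices and particulate concentrations
print(wide[["uaqi", "usa_epa", "pm25", "pm10"]].corr().round(2))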

4. Get air quality heatmap tiles

Now for the final use case of the Google Maps Air Quality API: generating heatmap tiles. The documentation about this is sparse, which is a shame, because these tiles are a powerful tool for visualizing current air quality, especially when combined with a Folium map.

We fetch them with a GET request, which involves building a URL in the following format, where the location of the tile is specified by zoom, x and y:

GET https://airquality.googleapis.com/v1/mapTypes/{mapType}/heatmapTiles/{zoom}/{x}/{y}

What do zoom, x and y mean? We can answer this by learning about how Google Maps converts latitude and longitude coordinates into “tile coordinates”, which is described in detail here. Essentially, Google Maps stores imagery in grids where each cell measures 256 x 256 pixels, and the real-world dimensions of each cell are a function of the zoom level. When we make a call to the API, we need to specify which grid to draw from, which is determined by the zoom level, and where on the grid to draw from, which is determined by the x and y tile coordinates. What comes back is a bytes array that can be read by the Python Imaging Library (PIL) or a similar image processing package.
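To make this concrete, here is a minimal sketch of that conversion, assuming the standard 256-pixel Web Mercator tiles that Google Maps uses (the TileHelper class below wraps the same logic):

import math

TILE_SIZE = 256

def lat_lon_to_tile(lat, lon, zoom):
    # clamp the Mercator projection away from the poles
    siny = min(max(math.sin(lat * math.pi / 180.0), -0.9999), 0.9999)
    # "world coordinates" on the 256 x 256 base map
    world_x = TILE_SIZE * (0.5 + lon / 360.0)
    world_y = TILE_SIZE * (0.5 - math.log((1 + siny) / (1 - siny)) / (4 * math.pi))
    scale = 1 << zoom  # the grid is 2**zoom cells along each axis
    # tile coordinates index the 256-pixel cells at this zoom level
    return (
        math.floor(world_x * scale / TILE_SIZE),
        math.floor(world_y * scale / TILE_SIZE),
    )

# our Los Angeles location at zoom 7 should land in tile (21, 51)
print(lat_lon_to_tile(34.1, -118.3, 7))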

Having formed our url in the above format, we can add a few methods to the Client class that will allow us to fetch the corresponding image.

from PIL import Image

def request_get(self, url):
    request_url = self.compose_url(url)
    response = self.session.get(request_url)

    # for images coming from the heatmap tiles service
    return self.get_image(response)

@staticmethod
def get_image(response):
    if response.status_code == 200:
        image_content = response.content
        # decode the returned bytes with PIL
        image = Image.open(io.BytesIO(image_content))
        return image
    else:
        print("GET request for image returned an error")
        return None

This is good, but what we really need is the ability to convert a set of coordinates in longitude and latitude into tile coordinates. The documentation explains how: we first convert the coordinates into the Mercator projection, from which we convert to “pixel coordinates” using the specified zoom level, and finally we translate those into tile coordinates. To handle all these transformations, we can use the TileHelper class below.

import math
import numpy as np

class TileHelper(object):

    def __init__(self, tile_size=256):
        self.tile_size = tile_size

    def location_to_tile_xy(self, location, zoom_level=4):
        # Based on function here
        # https://developers.google.com/maps/documentation/javascript/examples/map-coordinates#maps_map_coordinates-javascript
        lat = location["latitude"]
        lon = location["longitude"]

        world_coordinate = self._project(lat, lon)
        scale = 1 << zoom_level

        pixel_coord = (
            math.floor(world_coordinate[0] * scale),
            math.floor(world_coordinate[1] * scale),
        )
        tile_coord = (
            math.floor(world_coordinate[0] * scale / self.tile_size),
            math.floor(world_coordinate[1] * scale / self.tile_size),
        )

        return world_coordinate, pixel_coord, tile_coord

    def tile_to_bounding_box(self, tx, ty, zoom_level):
        # see https://developers.google.com/maps/documentation/javascript/coordinates
        # for details
        box_north = self._tiletolat(ty, zoom_level)
        # tile numbers advance towards the south
        box_south = self._tiletolat(ty + 1, zoom_level)
        box_west = self._tiletolon(tx, zoom_level)
        # tile numbers advance towards the east
        box_east = self._tiletolon(tx + 1, zoom_level)

        # (latmin, latmax, lonmin, lonmax)
        return (box_south, box_north, box_west, box_east)

    @staticmethod
    def _tiletolon(x, zoom):
        return x / math.pow(2.0, zoom) * 360.0 - 180.0

    @staticmethod
    def _tiletolat(y, zoom):
        n = math.pi - (2.0 * math.pi * y) / math.pow(2.0, zoom)
        return math.atan(math.sinh(n)) * (180.0 / math.pi)

    def _project(self, lat, lon):
        siny = math.sin(lat * math.pi / 180.0)
        siny = min(max(siny, -0.9999), 0.9999)

        return (
            self.tile_size * (0.5 + lon / 360),
            self.tile_size * (0.5 - math.log((1 + siny) / (1 - siny)) / (4 * math.pi)),
        )

    @staticmethod
    def find_nearest_corner(location, bounds):
        corner_lat_idx = np.argmin([
            np.abs(bounds[0] - location["latitude"]),
            np.abs(bounds[1] - location["latitude"]),
        ])

        corner_lon_idx = np.argmin([
            np.abs(bounds[2] - location["longitude"]),
            np.abs(bounds[3] - location["longitude"]),
        ])

        if (corner_lat_idx == 0) and (corner_lon_idx == 0):
            # closest is latmin, lonmin
            direction = "southwest"
        elif (corner_lat_idx == 0) and (corner_lon_idx == 1):
            direction = "southeast"
        elif (corner_lat_idx == 1) and (corner_lon_idx == 0):
            direction = "northwest"
        else:
            direction = "northeast"

        corner_coords = (bounds[corner_lat_idx], bounds[corner_lon_idx + 2])
        return corner_coords, direction

    @staticmethod
    def get_adjoining_tiles(tx, ty, direction):
        # remember that tile y coordinates increase towards the south
        if direction == "southwest":
            return [(tx - 1, ty), (tx - 1, ty + 1), (tx, ty + 1)]
        elif direction == "southeast":
            return [(tx + 1, ty), (tx + 1, ty + 1), (tx, ty + 1)]
        elif direction == "northwest":
            return [(tx - 1, ty - 1), (tx - 1, ty), (tx, ty - 1)]
        else:
            return [(tx + 1, ty - 1), (tx + 1, ty), (tx, ty - 1)]

We can see that location_to_tile_xy takes in a location dictionary and a zoom level and returns the tile in which that point can be found. Another helpful method is tile_to_bounding_box, which finds the bounding coordinates of a specified grid cell. We need this if we're going to geolocate the cell and plot it on a map.
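As a quick sanity check, we can run these helpers directly on the LA location used earlier (a sketch, assuming the location dictionary defined above):

helper = TileHelper()

# the tile containing our point at zoom level 7
_, _, tile_coord = helper.location_to_tile_xy(location, zoom_level=7)
print(tile_coord)

# and that tile's bounding box as (latmin, latmax, lonmin, lonmax)
print(helper.tile_to_bounding_box(tx=tile_coord[0], ty=tile_coord[1], zoom_level=7))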

Let's see how this works inside the air_quality_tile function below, which takes in our client, a location and a string indicating what type of tile we want to fetch. We also need to specify a zoom level, which can be difficult to choose at first and requires some trial and error. We'll discuss the get_adjoining_tiles argument shortly.

def air_quality_tile(
    client,
    location,
    pollutant="UAQI_INDIGO_PERSIAN",
    zoom=4,
    get_adjoining_tiles=True,
):
    # see https://developers.google.com/maps/documentation/air-quality/reference/rest/v1/mapTypes.heatmapTiles/lookupHeatmapTile

    assert pollutant in [
        "UAQI_INDIGO_PERSIAN",
        "UAQI_RED_GREEN",
        "PM25_INDIGO_PERSIAN",
        "GBR_DEFRA",
        "DEU_UBA",
        "CAN_EC",
        "FRA_ATMO",
        "US_AQI",
    ]

    # contains useful methods for dealing with tile coordinates
    helper = TileHelper()

    # get the tile that the location is in
    world_coordinate, pixel_coord, tile_coord = helper.location_to_tile_xy(
        location, zoom_level=zoom
    )

    # get the bounding box of the tile
    bounding_box = helper.tile_to_bounding_box(
        tx=tile_coord[0], ty=tile_coord[1], zoom_level=zoom
    )

    if get_adjoining_tiles:
        nearest_corner, nearest_corner_direction = helper.find_nearest_corner(
            location, bounding_box
        )
        adjoining_tiles = helper.get_adjoining_tiles(
            tile_coord[0], tile_coord[1], nearest_corner_direction
        )
    else:
        adjoining_tiles = []

    tiles = []
    # get all the adjoining tiles, plus the one in question
    for tile in adjoining_tiles + [tile_coord]:
        bounding_box = helper.tile_to_bounding_box(
            tx=tile[0], ty=tile[1], zoom_level=zoom
        )
        image_response = client.request_get(
            "/v1/mapTypes/" + pollutant + "/heatmapTiles/" + str(zoom)
            + "/" + str(tile[0]) + "/" + str(tile[1])
        )

        # convert the PIL image to numpy (request_get returns None on error)
        if image_response is not None:
            image_response = np.array(image_response)

        tiles.append({
            "bounds": bounding_box,
            "image": image_response,
        })

    return tiles

From reading the code, we can see that the workflow is as follows: First, find the tile coordinates of the location of interest. This specifies the grid cell we want to fetch. Then, find the bounding coordinates of this grid cell. If we want to fetch the surrounding tiles, find the nearest corner of the bounding box and then use that to calculate the tile coordinates of the three adjacent grid cells. Then call the API and return each of the tiles as an image with its corresponding bounding box.

We can run this in the standard way, as follows:

client = Client(key=GOOGLE_MAPS_API_KEY)
location = {"longitude": -118.3, "latitude": 34.1}
zoom = 7
tiles = air_quality_tile(
    client,
    location,
    pollutant="UAQI_INDIGO_PERSIAN",
    zoom=zoom,
    get_adjoining_tiles=False,
)

And then plot with folium for a zoomable map! Note that I'm using leafmap here, because this package can generate Folium maps that are compatible with gradio, a powerful tool for generating simple user interfaces for Python applications. Take a look at this article for an example.

import leafmap.foliumap as leafmap
import folium

lat = location["latitude"]
lon = location["longitude"]

map = leafmap.Map(location=[lat, lon], tiles="OpenStreetMap", zoom_start=zoom)

for tile in tiles:
    latmin, latmax, lonmin, lonmax = tile["bounds"]
    AQ_image = tile["image"]
    folium.raster_layers.ImageOverlay(
        image=AQ_image,
        bounds=[[latmin, lonmin], [latmax, lonmax]],
        opacity=0.7,
    ).add_to(map)

Perhaps disappointingly, the tile containing our location at this zoom level is mostly sea, although it's still nice to see the air pollution plotted on top of a detailed map. If you zoom in, you can see that road traffic information is being used to inform the air quality signals in urban areas.

Plotting an air quality heatmap tile on top of a Folium map. Image generated by the author.

Setting get_adjoining_tiles=True gives us a much nicer map because it fetches the three closest, non-overlapping tiles at that zoom level. In our case that helps a lot to make the map more presentable.

When we also fetch the adjacent tiles, a much more interesting result is produced. Note that the colors here show the Universal AQI index. Image generated by the author.

I personally prefer the images generated with pollutant="US_AQI", but there are several different options. Unfortunately the API does not return a color scale, although it would be possible to generate one using the pixel values in the image and knowledge of what the colors mean.
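As a starting point, here is a hedged sketch of extracting the most common colors in a fetched tile, which could then be matched against a published AQI color legend (this assumes the tiles list returned by air_quality_tile above, and that the tile images decode as RGBA arrays):

# take the first fetched tile; "image" is a numpy array converted from the PIL image
img = tiles[0]["image"]
if img is not None:
    # count how often each distinct color appears across all pixels
    colors, counts = np.unique(
        img.reshape(-1, img.shape[-1]), axis=0, return_counts=True
    )
    # the ten most common colors, which dominate the heatmap
    print(colors[np.argsort(-counts)][:10])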

The same tiles as above colored according to the US AQI. This map was generated on 10/12/2023, and the bright red spot in central CA appears to be a prescribed fire in the hills near Coalinga, according to this tool https://www.frontlinewildfire.com/california-wildfire-map/. Image generated by the author.

Conclusion

Thanks for making it to the end! Here we explored how to use the Google Maps Air Quality APIs to deliver results in Python, which could be used in all manner of interesting applications. In the future I hope to follow up with another article about the air_quality_mapper tool as it evolves further, but I hope that the scripts discussed here will be useful in their own right. As always, any suggestions for further development would be much appreciated!
