The Path of Least Surveillance: Route Planning Using AI Agents in a UTS Environment


Learn how to map CCTV cameras and license plate readers, then develop routing algorithms that prioritize low-visibility paths over speed and efficiency using AI Agents.

Coverage map showing buffered surveillance zones in OpenSphere

TL;DR: Surveillance networks like Flock and VDOT CCTV cameras make it nearly impossible to navigate without being tracked. We can tackle this by building a custom routing model using AI agents and publicly available data, focusing on routes that minimize surveillance exposure. This post demonstrates how to use AI agents to plan multi-leg trips while avoiding known UTS collection points.

Introduction

Northern Virginia is rapidly evolving into a UTS (Ubiquitous Technical Surveillance) environment. With the proliferation of CCTV, license plate readers, and other monitoring technologies throughout the area, privacy-conscious navigation has become increasingly important.

In my previous post on building a UTS environment in Northern Virginia, I demonstrated how to aggregate and visualize CCTV data to track moving buses across Fairfax County. A commenter on that post made an excellent suggestion:

You could reverse the process as well, and figure out if you were the Fairfax Connector driver, is there a route that is not on camera to get from one place to another?

Building on that foundation, I’ll develop a routing model that leverages known UTS collection points to create “low visibility” routes and utilize AI Agents to automate multi-leg trip planning.

Finding Data Sources

In any AI-enabled project, the golden rule is “Garbage In, Garbage Out.” Our routing engine is only as smart as the map underneath it (and the polygons we generate later on). If our dataset misses a camera, our “low viz” route might walk us straight past an unmapped lens.

To build a navigation system that penalizes routing past cameras, we first have to build a comprehensive list of the things we want to avoid, along with their locations. Since there is no single “API for Surveillance” (not a bad idea), we have to get creative with our sources.

1. Explicit Cameras: DeFlock & OSM

  • deflock.me — a community-driven project to map Flock camera locations. The data is crowd-sourced and available for download both in OSM and as a separate dataset. While it doesn’t cover every camera in DC, it’s a solid start.
  • OpenStreetMap (OSM) — a treasure trove of geospatial data. Many CCTV cameras are freely tagged in OSM as “surveillance” or “CCTV.” We can query those tags with the Overpass API and download the results in GeoJSON format, as sketched below.
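
For example, a quick Overpass pull of every mapped camera works like this. This is a minimal sketch: the bounding box is a rough approximation of DC, though man_made=surveillance is the standard OSM tag.

import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

# Rough bounding box for Washington, DC (south, west, north, east)
QUERY = """
[out:json][timeout:60];
node["man_made"="surveillance"](38.79,-77.12,39.00,-76.90);
out body;
"""

resp = requests.post(OVERPASS_URL, data={"data": QUERY}, timeout=90)
resp.raise_for_status()

cameras = [
    {"lat": el["lat"], "lon": el["lon"], "tags": el.get("tags", {})}
    for el in resp.json()["elements"]
]
print(f"Found {len(cameras)} mapped cameras")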

To increase the robustness of our routing, we will assume that certain facilities always have surveillance, even if no one has mapped the specific camera yet.

Using Wikidata and OSM facility tags, we search for:

  1. Post Offices
  2. Banks & ATMs

We could build a more comprehensive list, but this is enough to showcase and test our routing engine. Let’s download all this information so we can upload it to OpenSphere, my go-to mapping application for data analysis!

Map of DC surveillance grid in OpenSphere
Visualizing the DC surveillance grid in OpenSphere. The map overlays Banks, Post Offices, Flock Cameras and other Cameras from OSM and DeFlock

Let’s break down what we see in the image above.

  • 💲 Financial Institutions: Locations of Banks and ATMs (High probability of CCTV).

  • 📷 Known Cameras: Confirmed surveillance cameras, ranging from Flock Safety devices to traffic cameras and crowdsourced CCTV tags.

  • ✉️ Postal Services: Distribution centers and retail stores for FedEx, USPS, and UPS.

Now that we have a map populated with locations we want to avoid, it is a good idea to perform some ground truthing. After all, open data is only useful if it represents reality.

Let’s zoom into a camera location in Eastern Market and verify if it actually exists using Google Street View.

Now that we’ve curated our datasets and confirmed the installations, it’s time to merge them into a single GeoJSON of ‘No-Go’ zones.
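
The merge itself is just concatenating FeatureCollections. A quick sketch, with the input filenames as placeholder assumptions:

import json

# Placeholder filenames for the curated point datasets
SOURCES = ["deflock.geojson", "osm_cameras.geojson", "banks_atms.geojson", "post_offices.geojson"]

features = []
for path in SOURCES:
    with open(path) as f:
        features.extend(json.load(f)["features"])

with open("cameras.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f)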

Coverage Map

Right now, our dataset is just a collection of dots. To route around them effectively, we need to model where these cameras actually see by converting the points into areas of influence.

Generating Buffers

Since most of the data in our dataset lacks directional information (field of view), we need to support two types of buffers:

  1. Omnidirectional Buffers: For cameras without directional data, we create circular buffers around each point. A radius of 35 meters is a reasonable estimate for general surveillance coverage in urban areas.
  2. Directional Buffers: For cameras with known orientations (like Flock cameras), we create a 70° Field of View (FOV) cone extending 80 meters out.

We use Python to build these buffers with Shapely:

from typing import Dict

import pyproj
from shapely.geometry.base import BaseGeometry
from shapely.ops import transform

# Reusable transformer: WGS84 lat/lon -> UTM zone 18N (meters)
PROJ_TO_METRIC = pyproj.Transformer.from_crs(
    "EPSG:4326", "EPSG:32618", always_xy=True
).transform

POINT_BUFFER_METERS = 35  # omnidirectional radius from above

def inflate_geometry(geom: BaseGeometry, properties: Dict[str, object]) -> BaseGeometry:
    # Project to meters (EPSG:32618) for accurate measuring
    projected = transform(PROJ_TO_METRIC, geom)

    if projected.geom_type == "Point":
        # Check if we have directional data (parse_direction reads the bearing tag)
        direction = parse_direction(properties)

        # If we know where it's looking, build a cone (sketched below)
        if direction is not None:
            x, y = projected.coords[0]
            return build_directional_cone(x, y, direction)

        # Otherwise, assume it sees everything (circle)
        return projected.buffer(POINT_BUFFER_METERS)
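
The build_directional_cone helper referenced above could sweep the 70° arc like this. This is a sketch that assumes the point is already projected to meters; the compass-bearing convention and 16-segment arc sampling are my own choices:

import math

from shapely.geometry import Polygon

CONE_RANGE_METERS = 80   # how far the camera sees
CONE_FOV_DEGREES = 70    # field-of-view width

def build_directional_cone(x: float, y: float, bearing_deg: float) -> Polygon:
    # Convert a compass bearing (0 = north, clockwise) to a math angle
    center = math.radians(90.0 - bearing_deg)
    half_fov = math.radians(CONE_FOV_DEGREES / 2.0)

    # Sample 17 points along the arc so the cone edge is smooth
    arc = [
        (
            x + CONE_RANGE_METERS * math.cos(center - half_fov + i * (2 * half_fov) / 16),
            y + CONE_RANGE_METERS * math.sin(center - half_fov + i * (2 * half_fov) / 16),
        )
        for i in range(17)
    ]
    # Apex at the camera, fanning out through the arc
    return Polygon([(x, y), *arc])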

Once we have our buffers, we can merge them into a single “coverage” layer that represents all the areas we want to avoid.
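
A sketch of that merge with Shapely’s unary_union, reading the combined cameras.geojson from earlier and writing the dissolved layer back out in WGS84:

import json

import pyproj
from shapely.geometry import mapping, shape
from shapely.ops import transform, unary_union

# Inverse of PROJ_TO_METRIC: back from meters to lat/lon for GeoJSON output
PROJ_TO_WGS84 = pyproj.Transformer.from_crs(
    "EPSG:32618", "EPSG:4326", always_xy=True
).transform

with open("cameras.geojson") as f:
    fc = json.load(f)

zones = [
    inflate_geometry(shape(feat["geometry"]), feat.get("properties", {}))
    for feat in fc["features"]
]

# Dissolve overlapping buffers into one coverage geometry
coverage = transform(PROJ_TO_WGS84, unary_union(zones))

with open("coverage.geojson", "w") as f:
    json.dump({"type": "Feature", "properties": {}, "geometry": mapping(coverage)}, f)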

There is a massive trade-off here. The size of these buffers is configurable; larger buffers provide more privacy, but they blindly “delete” valid sidewalks from our map. If the buffers are too large in a dense area like Downtown DC, they connect like a chain-link fence, making it impossible for the router to find a path.

We also face the issue of indoor interference. For example, an ATM located deep inside a building still generates a buffer. If that circle extends out into the street, it tricks the router into thinking the sidewalk is watched, effectively blocking a safe path.

Coverage map showing buffered surveillance zones in OpenSphere

In the visualization above, you can see the individual camera points inside the new exclusion polygons (the bubbles). Where the cameras are dense, the bubbles merge, creating large swaths of “forbidden” areas.

Below is the final coverage layer. You can clearly see how individual camera buffers merge into large unroutable blocks. The density is highest exactly where you would expect: the National Mall, the White House, and the federal buildings form a “chain-link” fence.

Final coverage layer showing merged surveillance zones in OpenSphere

The Routing Model

There are several powerful open-source routing engines available today, such as Valhalla (famously used by Tesla) or OSRM. Each has its own method for calculating costs and applying custom data layers.

However, out of the box, they are mainly focused on efficiency. To make them care about privacy, we have to modify the rules they apply to generating routes.

I won’t dive into the specific configuration files for tuning the engine, but I will outline the general architectural logic. If you are interested in building your own custom model or have questions, please reach out to me to discuss.

  1. Base Map: We start with OpenStreetMap data for Washington, DC, which provides the road and sidewalk network.
  2. Coverage Layer Integration: We integrate our coverage layer into the routing engine as a “cost modifier.” Any edge (road segment) that intersects with our coverage polygons gets a significant cost penalty. For example, if a sidewalk segment falls within a surveillance zone, we assign it a high cost value (e.g., 1000) compared to normal segments (e.g., 1).
  3. Cost Function: The routing engine’s cost function is modified to prioritize paths that avoid high-cost edges. This means that routes passing through surveillance zones become less favorable.

To the algorithm, walking 10 meters past a camera now feels like walking 10 kilometers. It will do everything in its power to find a clean path around the surveillance. But, if the only way to reach the destination is to briefly step into a camera zone, it will do so—choosing the path of least exposure rather than failing completely.
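
The engine-specific configuration differs per router, but the core idea fits in a few lines. Here is a toy sketch using networkx rather than Valhalla or OSRM, assuming graph nodes are projected (x, y) coordinates and coverage is the merged polygon from earlier:

import networkx as nx
from shapely.geometry import LineString

SURVEILLED_PENALTY = 1000.0  # illustrative cost multiplier

def apply_coverage_penalty(graph: nx.Graph, coverage) -> None:
    # Inflate the cost of any edge whose geometry touches the coverage layer
    for u, v, data in graph.edges(data=True):
        if LineString([u, v]).intersects(coverage):
            data["weight"] = data.get("weight", 1.0) * SURVEILLED_PENALTY

# The router then prefers clean edges, but still finishes if forced:
# path = nx.shortest_path(graph, source, target, weight="weight")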

The specific tuning of these weights is where the magic happens. It turns a binary “Yes/No” map into a nuanced navigation system that “feels” the surveillance pressure and flows around it.

This just scratches the surface of what is possible. We can get as fine-grained as we want: penalizing specific turns, applying different weights to different surveillance providers (avoiding private cameras more than traffic cams), or even factoring in time-of-day exposure.

Does it Work?

To test our routing engine, we can compare traditional routes from Google Maps with our “low viz” routes generated by our custom model.

Our test route is a drive from L’Enfant Plaza to the Royal Thai Ambassador’s Residence.

  1. Google Maps Route (Blue): The default route takes us straight up 17th St NW, passing directly in front of multiple known cameras.
  2. Low Viz Route (Red): Our custom routing engine suggests a more “scenic” path, diverting onto side streets and avoiding the main thoroughfares where cameras are concentrated.

The low visibility route is longer and less direct, but it successfully avoids the high-surveillance areas identified in our coverage layer. This demonstrates the effectiveness of our approach in generating routes that minimize exposure to surveillance.

What if we want to chain several routes together? Stitching multi-leg routes together by hand is tedious and error-prone. Can AI help us?

Using AI Agents to Automate Route Planning

Building a routing model is cool, but it’s still just a simple route calculator. It can get you from A to B, but it can’t plan a trip.

This is where AI agents come in. By leveraging PydanticAI, we can create intelligent agents that understand our privacy-first routing model and can plan complex trips while minimizing surveillance exposure or prioritizing efficiency when needed.

Instead of writing code to calculate a route, I can tell the Agent:

Drive from the International Spy Museum to the Embassy of The Dominican
Republic in Washington DC.

Then bike from the Embassy of The Dominican Republic to the Saint Augustine Catholic
Church. Don't let me get seen on camera for these first 2 routes.

Finally, walk from the Saint Augustine Catholic Church to
1600 Pennsylvania Ave NW, Washington, DC 20500 where I am on camera.

Agent Workflow

We set up a PydanticAI agent with the following tools at its disposal:

  1. find_location: Given a place name, return its lat/lon coordinates. This tool helps the agent identify start and end points for each leg of the trip, leveraging the Mapbox Geocoding API (sketched after this list).
  2. get_route: Given start and end coordinates and a routing preference (car, bike, walking or bus), return a route using our custom routing engine. This tool interfaces with our routing engine via REST API to fetch the appropriate path.
  3. upload_to_s3_and_get_url: Upload the generated GeoJSON route to S3 and return a presigned URL for easy access and visualization in OpenSphere.
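
As an illustration, find_location might wrap Mapbox’s forward geocoding endpoint like this. This is a simplified sketch; in the real agent it is registered as a tool just like the snippet further below:

import os

import httpx

MAPBOX_TOKEN = os.environ["MAPBOX_TOKEN"]

async def find_location(place: str) -> tuple[float, float]:
    """Resolve a place name to (lat, lon) via the Mapbox Geocoding API."""
    url = f"https://api.mapbox.com/geocoding/v5/mapbox.places/{place}.json"
    async with httpx.AsyncClient() as client:
        resp = await client.get(url, params={"access_token": MAPBOX_TOKEN, "limit": 1})
        resp.raise_for_status()
    # Mapbox returns [lon, lat]; flip it to (lat, lon) for the router
    lon, lat = resp.json()["features"][0]["center"]
    return lat, lon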

I am a huge fan of OpenRouter since it provides access to multiple LLM providers (OpenAI, Anthropic, etc.) under a single API. For this project, I used Anthropic’s Claude Sonnet 4.5 model via OpenRouter for its strong agentic capabilities.

Snippet of the PydanticAI agent setup defining the tools the agent can access:

openrouter_model = OpenRouterModel(
    "anthropic/claude-4.5-sonnet",
    provider=OpenRouterProvider(api_key=openrouter_api_key),
)

map_agent = Agent(
    openrouter_model,
    output_type=AgentOutput,
    system_prompt="""
You are a travel-routing orchestrator that must turn natural-language itineraries into a
downloadable route artifact.

Workflow requirements:
1. Interpret the itinerary chronologically and split it into legs whenever the
transportation mode changes or a new origin/destination pair appears.

........

Always rely on tools for coordinates, routing, joining, and uploading. The presigned URL
is the entire final response.
""")

@map_agent.tool
async def get_route(ctx: RunContext[Deps], route_leg: RouteLeg) -> RouteResponse:
    ...

@map_agent.tool
async def upload_to_s3_and_get_url(
    ctx: RunContext[Deps], data: Feature[MultiLineString, dict[str, Any]]
) -> str:
    ...

@map_agent.tool
async def join_route_segments(
    ctx: RunContext[Deps], segments: list[RouteResponse]
) -> Feature[MultiLineString, dict[str, Any]]:
    ...

With the agent configured, we can now hand it the natural-language itinerary shown at the start of this section.

The agent processes this request, breaks it down into individual legs, and uses the tools to fetch routes that minimize surveillance exposure for the first two legs. It then combines these routes into a single GeoJSON file and uploads it to S3.
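
Kicking off a run looks roughly like this, assuming a Deps object carrying the HTTP client and S3 credentials the tools need:

import asyncio

ITINERARY = "..."  # the natural-language prompt shown earlier

async def main() -> None:
    result = await map_agent.run(ITINERARY, deps=Deps())  # Deps wiring omitted
    print(result.output)  # presigned S3 URL (result.data on older PydanticAI)

asyncio.run(main())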

The final output is a presigned S3 URL where we can download the complete route GeoJSON. This file can then be visualized in OpenSphere to see the entire trip laid out, with the first two legs avoiding surveillance zones as requested.

What’s Next

This project is just the beginning of what’s possible when combining open data, custom routing models, and AI agents. Here are some ideas for future enhancements:

  • High-Fidelity Penalties: Moving beyond binary “avoid” zones to fine-grained logic.
  • Red Cell Your Route: Do you stand out when taking this route compared to Google Maps? (Blog coming soon!)

If you are interested in geospatial AI, routing engines, or building custom OSINT tools, feel free to reach out.