March 29, 2026

When IT Says No — A GIS Developer's Response Framework

Every GIS developer who has built something custom inside a large organization has had this conversation. You've built a tool that works. People are using it. Then IT finds out and the questions start.

The objections are almost always the same five. The answers are technical, specific, and usually satisfying — if you know what to say.


The Context

Custom GIS tools built with the ArcGIS JavaScript SDK, deployed to platforms like Vercel, and integrated with AI APIs represent a newer pattern than IT departments are accustomed to reviewing. The comparison they make is usually to no-code platforms — Experience Builder, Dashboards, Survey123 — which already have organizational approval.

The case you're making isn't that the custom stack is risk-free. It's that the risk profile is comparable to what's already approved, and in some areas more favorable.


Objection 1: "You're not using approved software."

The response: The authentication layer is identical.

Custom ArcGIS SDK applications use Esri's OAuth 2.0 implementation — the same mechanism used by Experience Builder, Web AppBuilder, and every Esri native application. There is no custom authentication code. User credentials are entered on arcgis.com, never on the custom application. The app receives a time-limited token scoped to the logged-in user's existing permissions. Token expiry and revocation are managed entirely by ArcGIS Online.

From a credential security standpoint, the custom app is indistinguishable from a native Esri application. It uses the same auth server, the same token format, and the same permission model.
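A minimal sketch of what that flow looks like from the app's side. The endpoint is Esri's standard OAuth 2.0 authorize endpoint; the client ID and redirect URI here are hypothetical placeholders. The user enters credentials on arcgis.com, and the application only ever receives the resulting short-lived token.

```javascript
// Build the authorization URL a custom app sends the browser to.
// Credentials are typed on arcgis.com; the app never sees a password.
function buildAuthorizeUrl(clientId, redirectUri) {
  const params = new URLSearchParams({
    client_id: clientId,
    response_type: "token", // implicit flow: AGOL returns a time-limited token
    redirect_uri: redirectUri,
    expiration: "120", // requested token lifetime in minutes
  });
  return `https://www.arcgis.com/sharing/rest/oauth2/authorize?${params}`;
}
```

In practice the `@arcgis/core` IdentityManager constructs this URL for you from an `OAuthInfo` registration; the sketch just makes the point visible that no authentication logic lives in application code.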


Objection 2: "We can't audit your code."

The response: You can audit this code better than the alternative.

Experience Builder app configuration is distributed across an AGOL item, a draft config, a published config, data source bindings, Arcade expressions in multiple contexts, and sharing settings. Understanding what an EB app does requires navigating an interface that wasn't designed for reading.

A custom SDK application is a file — or a small set of files — in a Git repository. Every line is visible. Every function is named. Every data source is declared explicitly with its URL, its field list, and its query logic. A code reviewer with basic web development knowledge can read it from top to bottom in an afternoon.

Custom code is more auditable than configuration, not less. The audit surface is a repository. Offer to walk IT through it.
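For illustration, a declared data source might look like the following in the repository (the service URL, fields, and filter are hypothetical). Everything a reviewer needs is in one place: where the data comes from, which fields are read, and how rows are filtered.

```javascript
// An explicitly declared data source, as it might appear in a custom app.
// A reviewer can see the service, the field list, and the query in one glance.
const giftsLayerConfig = {
  url: "https://services.arcgis.com/ORG_ID/arcgis/rest/services/Gifts/FeatureServer/0",
  outFields: ["GIFT_DATE", "GIFT_AMOUNT", "DONOR_ID"],
  definitionExpression: "GIFT_AMOUNT > 0", // only the rows this view should show
};
```

Reconstructing the same three facts from an Experience Builder app means clicking through data source panels and published configs; here they are three lines of reviewable text.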


Objection 3: "Vercel isn't an approved host."

The response: Review the SOC 2 documentation, then compare it to what's already approved.

Vercel holds SOC 2 Type II certification. HTTPS is enforced on all deployments. Environment variables (API keys, credentials) are server-side only and never exposed in client-facing code. The deployment pipeline has an audit trail. There are no servers to patch — the infrastructure is managed by Vercel.

The comparison to make: what is the currently approved hosting? On-premise servers typically have longer patch cycles, manual SSL certificate management, and less rigorous access logging than a purpose-built cloud platform. The question isn't whether Vercel is perfect — it's whether it's materially worse than what's already in use.
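The server-side-only credential claim is concrete. A sketch, assuming a Vercel serverless function (in the real project it would live at something like `api/analyze.js` and be the file's default export; the variable name `AI_API_KEY` is hypothetical): the key is read from an environment variable on the server and never ships to the browser.

```javascript
// Sketch of a Vercel serverless function handler. The AI key lives in an
// environment variable set in the Vercel dashboard, not in committed code,
// and is only ever read server-side.
async function handler(req, res) {
  const key = process.env.AI_API_KEY; // never present in client bundles
  if (!key) {
    return res.status(500).json({ error: "AI_API_KEY is not configured" });
  }
  // ...call the AI provider here with the key, server-side only...
  return res.status(200).json({ ok: true });
}
```

The browser talks to `/api/analyze`; the key never crosses that boundary in either direction.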


Objection 4: "Sending data to an AI API is a data governance violation."

The response: Define what's actually being sent.

This is the objection that deserves the most precise answer, because it depends entirely on implementation.

The correct architecture for AI integration in GIS tools: the AI receives schema metadata — field names, field types, value ranges — not feature data. A prompt that says "this layer has fields GIFT_DATE (date), GIFT_AMOUNT (float), DONOR_ID (integer)" contains no PII. The field names are not sensitive. The data those fields hold is.

If the implementation sends actual attribute values to an external API, that is a legitimate governance concern and should be addressed before deployment. If it sends only schema, the concern evaporates. Know which one you've built, and be able to demonstrate it.
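The schema-only pattern is a few lines of code, which makes it easy to demonstrate in a review. A sketch (the layer object and prompt wording are illustrative): the prompt is built from field names and types, and no attribute rows ever enter it.

```javascript
// Build an AI prompt from layer schema only: field names and types,
// never the values those fields hold.
function buildSchemaPrompt(layer) {
  const fieldList = layer.fields
    .map((f) => `${f.name} (${f.type})`)
    .join(", ");
  return (
    `Layer "${layer.title}" has fields: ${fieldList}. ` +
    `Suggest useful queries a user of this layer might run.`
  );
}

const prompt = buildSchemaPrompt({
  title: "Gifts",
  fields: [
    { name: "GIFT_DATE", type: "date" },
    { name: "GIFT_AMOUNT", type: "float" },
    { name: "DONOR_ID", type: "integer" },
  ],
});
// The prompt names fields and types only: no attribute rows, no PII.
```

Showing IT this function, and showing that no other code path sends feature data outward, is the demonstration the objection calls for.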

For cases where real data must be analyzed by AI, the answer is a local model via Ollama — same capability, zero data leaving the network. That's worth having in your back pocket.
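The local-model path uses the same request shape. A sketch against Ollama's default local endpoint (the model name is an example): the prompt, real attribute data included, goes to localhost and nowhere else.

```javascript
// Build a request for a local Ollama instance on its default port.
// Sensitive data in the prompt never leaves the network.
function localAnalysisRequest(prompt) {
  return {
    url: "http://localhost:11434/api/generate", // Ollama's default endpoint
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "llama3", prompt, stream: false }),
    },
  };
}
```

Swapping between the external API and the local model is then a change of URL and key handling, not a change of architecture.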


Objection 5: "Your app could corrupt or breach AGOL data."

The response: The permission model prevents it architecturally.

A custom SDK application can only do what the logged-in user's AGOL account can do. If that account has read-only access to a layer, the app has read-only access. If the account doesn't have permission to edit a feature service, the app cannot edit it. The permission boundary is enforced by AGOL, not by the application.

The attack surface is the AGOL account itself — the same attack surface that exists when that user logs into the AGOL portal directly. The custom app adds no new exposure. A compromised AGOL account is a compromised AGOL account whether the vector is the native portal or a custom application.
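The app can also reflect that server-enforced boundary in its UI. Feature services advertise their allowed operations in a comma-separated `capabilities` string in their REST metadata (e.g. `"Query,Editing"`); a sketch of hiding edit tools when the service doesn't permit editing:

```javascript
// The server enforces permissions; the app can only mirror them.
// Check the service's advertised capabilities before offering edit tools.
function canEdit(serviceInfo) {
  const caps = (serviceInfo.capabilities || "")
    .split(",")
    .map((c) => c.trim());
  return caps.includes("Editing");
}
```

Even if this check were removed, an edit attempt against a read-only service would be rejected by AGOL itself; the check is a courtesy to the user, not the security boundary.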


What Actually Matters

The legitimate additional responsibilities that come with custom tooling:

Dependency management. npm packages can introduce vulnerabilities. npm audit should be part of the build process. Packages should be reviewed before they're added. This is a standard expectation of any web development project.

Credential hygiene. API keys must live in environment variables, never in code. .env files must be in .gitignore before the first commit. This is non-negotiable and not difficult.
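A minimal starting point for the ignore rules, as a sketch (entries beyond `.env` depend on the project):

```
# .gitignore: keep secrets and local artifacts out of the first commit
.env
.env.local
node_modules/
```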

OAuth app registration. The redirect URI for the AGOL OAuth app registration should be locked to the exact production domain. No wildcards. One domain, one registration.

These aren't objections — they're implementation requirements. Meeting them is the developer's responsibility, not IT's.


The Conversation to Have

The most productive IT conversation isn't a defense. It's a structured review offer:

  1. Code review — here's the repository, here's the architecture, let's walk through it
  2. Data flow documentation — here's exactly what data goes where, including the AI path
  3. OAuth app audit — here's the AGOL registration, here's the redirect URI, here's the token scope
  4. Hosting compliance — here's Vercel's SOC 2, here's how it compares to the existing standard

Organizations that do this review almost always reach the same conclusion: the custom stack is auditable, the security model is sound, and the tooling enables capabilities that no-code platforms don't provide. The conversation just needs to happen before deployment, not after.


The GIS developers who navigate this well aren't the ones who avoid IT. They're the ones who show up with documentation, speak the security vocabulary, and demonstrate that they've already thought through the objections. That's a different conversation than "it works, trust me."

It also makes the next tool easier to deploy.