I work with sensitive humanitarian data every day. Disaster client records. Donor profiles. Shelter locations for populations that need to stay hidden from people actively looking for them.
This shapes every technical decision I make. Not as an afterthought — as the first constraint, before anything else is decided.
The hardware key
My primary authentication device is a YubiKey — a physical USB security token. It plugs into my machine. Nothing authenticates without it in the slot.
This is not standard practice at most nonprofits. It is standard practice at banks, and among anyone who has lived through a credential breach.
The threat model is simple: passwords are phishable. MFA codes sent to your phone are interceptable via SIM swap. Hardware keys are not. The private key never leaves the device. There is no server to intercept. There is no SMS to reroute. You either have the physical key or you don't.
Every account that matters — GitHub, Vercel, ArcGIS Online, Google — requires it. The friction of plugging in a USB device is the cheapest security investment I've ever made.
Geomasking
The most dangerous thing in GIS work is a precise point.
A precise point is a home address. A shelter location. A domestic violence survivor's intake location. Data that, if exported unmasked, is immediately actionable by bad actors.
Geomasking is the practice of deliberately degrading geographic precision to protect individuals while preserving the analytical value of the data.
Random displacement. Points are moved a random distance — say, 500 to 2,000 meters — in a random direction from the true location. The distribution of points still supports regional analysis. No individual point reveals a true address.
Donut masking. A variant with a minimum radius enforced. The displaced point cannot land closer than 250 meters to the true location, so no "lucky" displacement produces a near-accurate result. Standard in public health research.
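Both techniques can be sketched in a few lines. This is a minimal illustration, not a production implementation: it assumes planar coordinates in meters (a projected CRS), and the function name is mine, not from any library. Setting the minimum radius to zero gives plain random displacement; a positive minimum gives donut masking.

```python
import math
import random

def donut_mask(x: float, y: float,
               min_radius: float = 250.0,
               max_radius: float = 2000.0) -> tuple[float, float]:
    """Displace a point a random distance between min_radius and
    max_radius (meters) in a uniformly random direction."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    # Sample radius so displaced points are uniform over the annulus
    # area rather than clustered near the inner ring.
    r = math.sqrt(random.uniform(min_radius ** 2, max_radius ** 2))
    return x + r * math.cos(angle), y + r * math.sin(angle)
```

The square-root sampling matters: drawing the radius uniformly would over-concentrate points near the true location, weakening the mask.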
Grid aggregation. Instead of points, data is reported at the census tract, ZIP code, or county level. Individual precision is gone. Pattern analysis remains. This is the right approach for any public-facing map showing client volume by geography.
K-anonymity. A cell is suppressed — hidden or merged — unless it contains at least K records, typically 5 or 10. Single-point cells never appear. Used for any demographic breakdown that could identify individuals in sparse areas.
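Grid aggregation and k-anonymity combine naturally: count records per area unit, then suppress any cell below the threshold. A minimal sketch, with hypothetical names and the area identifiers standing in for tract or ZIP codes:

```python
from collections import Counter

def aggregate_with_suppression(area_ids: list[str], k: int = 5) -> dict[str, int]:
    """Count records per area, suppressing (dropping) any area with
    fewer than k records so sparse cells never reach a public map."""
    counts = Counter(area_ids)
    return {area: n for area, n in counts.items() if n >= k}
```

With k=5, an area holding twelve clients appears with its count; an area holding three vanishes entirely rather than flagging a small, identifiable population.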
The rule I follow without exception: if a point on a map could lead someone to a specific household, it does not appear on any public or org-shared layer. It lives in a private group-restricted layer that requires AGOL login and explicit group membership.
I've had the conversation with IT security. The answer is always the same: geomasking is not optional for client data. It is the minimum standard.
Zero credentials in code
Every credential — API keys, client secrets, tokens — lives in environment variables. Never in code. Never in a comment. Never in a commit message.
This is enforced at the .gitignore level before a single line is written. .env files are listed globally. The habit is pre-commit, not post-breach.
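The same habit can be enforced in code: read secrets only from the environment and fail loudly, at startup, when one is missing. A small sketch; the helper name and variable name are mine, not from any framework:

```python
import os

def require_env(name: str) -> str:
    """Fetch a credential from the environment, failing loudly if absent,
    so a misconfigured deploy dies at startup instead of mid-request."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```

Calling `require_env("AGOL_CLIENT_SECRET")` at the top of a script means a missing secret surfaces immediately, and the value itself never appears in source.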
The risk is irreversible: once a credential is committed to a public repo, it is exposed, and deleting the file does not help because git history preserves it. GitHub's secret scanning catches some of it. Automated bots that index new commits catch more of it, faster. By the time you notice and rotate the key, it may already have been used.
Rotating a compromised AGOL client secret requires IT involvement, a change management ticket, and notifying everyone whose apps use that client ID. One .gitignore omission can cost a week of remediation.
The thin HTML security model
The thin HTML deploy pattern I use is not just fast — it is the most secure web architecture I've found for this kind of work.
There is no server. There is no database. There is nothing to breach beyond what your AGOL org already exposes. The app is a static file. The only attack surface is the auth layer — AGOL OAuth 2.0 — which is Esri's problem to maintain, not mine.
Private layers stay private. A feature layer shared only within your org stays invisible to anyone without an AGOL login. The HTML file reveals nothing. No credentials. No data. No structure beyond the UI.
In a compliance conversation, this is a meaningful statement: "Our custom map tool has no server, no credentials, and relies entirely on ArcGIS Online's OAuth 2.0 and layer-level sharing controls." IT can evaluate that. They already evaluated AGOL. This is AGOL.
Blast radius thinking
When I build something, I ask: if this were breached, what is the worst case?
Thin HTML app, public data only: worst case is someone reads what anyone can already read. Acceptable.
Thin HTML app, org-private data: worst case is someone with an org account sees org data they could see anyway by navigating AGOL directly. The thin app adds no exposure.
Thin HTML app, group-private data: worst case requires a compromised account inside that specific group. Contained to a defined population.
Server with database: worst case is the database. This is why I don't build servers for humanitarian data unless the requirement is impossible to meet otherwise.
The pattern: minimize blast radius at every layer, then accept the residual risk consciously and explicitly. Not defensively. Consciously.
The actual checklist
Before any tool touches real data:
- Auth model confirmed — OAuth, not username/password
- Layer sharing settings audited — who can see this if they know the URL?
- Geomasking applied to any client-level point data
- .env for credentials, .gitignore confirmed before first commit
- Private repo until reviewed for public data content
- YubiKey required for any account with write access
After the tool is live:
- No credentials in code — automated scan before merge
- No PII in git history — filter-repo if anything slipped through
- Layer permissions reviewed on a regular cadence
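The "automated scan before merge" step can be as simple as a regex pass over changed files in a pre-merge hook. A minimal sketch: the patterns below are illustrative, not exhaustive, and a real setup should lean on a dedicated tool (gitleaks, GitHub secret scanning) rather than a hand-rolled list.

```python
import re

# Illustrative patterns only; real scanners ship far larger rule sets.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),                                # AWS access key ID
    re.compile(r"-----BEGIN (?:RSA|EC|OPENSSH) PRIVATE KEY-----"),  # private key header
    re.compile(r"(?i)(?:api[_-]?key|client[_-]?secret)\s*[:=]\s*['\"][^'\"]{16,}"),
]

def scan_text(text: str) -> list[str]:
    """Return every secret-looking string found in text."""
    hits: list[str] = []
    for pattern in SECRET_PATTERNS:
        hits.extend(m.group(0) for m in pattern.finditer(text))
    return hits
```

A non-empty result fails the merge. The point is not perfect detection; it is catching the careless paste before it becomes a week of rotation and remediation.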
This sounds like a lot of overhead for small tools. It is. That is the point. Small tools at humanitarian organizations hold data that belongs to people in crisis. The cost of a breach is not proportional to the size of the tool.
The YubiKey is not paranoia. It is the minimum bar for anyone who holds access to data that belongs to people who trusted an organization to keep it safe.
Related: OAuth Without the Pain — The Thin HTML Deploy Pattern