Is the tenancy application holding us back? AI already knows more about renters than any form, so why are we still asking?
Here’s a thought experiment: imagine you go to your favorite streaming service, and before you can start watching anything, it makes you complete a 7-page PDF about your favorite movie genres, the last five shows you watched, your employment history, whether you have pets, and a copy of your driver’s license.
You’d scream, cancel your subscription, and tweet about it. And yet… that’s exactly what we do every time we try to rent an apartment.
The application form is dead. It just doesn’t know it yet.
In a world where AI can predict what song you’ll vibe with during your commute or approve your credit card in 2 seconds flat, we’re still manually typing out our job titles into fields built for Windows 95. Still uploading the same pay stub for the fifth listing this month. Still faxing(!) documents to landlords boasting about their “AI-powered” property platforms.
What are we doing?
Performative Bureaucracy: A 90s Ritual in a Real-Time World
Let’s get real. The traditional rental application wasn’t designed to be smart. It was designed to be safe. Not for tenants, but for landlords, agents, gatekeepers. The five pages of checkboxes? That’s paperwork masquerading as diligence. It's the illusion of control, not the engine of insight.
Just like those restaurants that added QR-code menus and called themselves "tech-enabled" without fixing what actually matters (the kitchen workflow, the 45-minute wait), real estate portals and property managers slap “AI” into their decks while still clinging to PDF forms and reference calls like it's still Clinton-era America.
The truth?
Most rental applications are less informative than your Spotify Wrapped. At least that one captures actual behavior.
Meanwhile, AI is already three steps ahead on every signal that matters.
What Landlords Think They Need, and What Actually Works
Ask a property manager why we still need rental forms, and they’ll give you some version of this:
“It’s risk mitigation. We need to make sure applicants have stable income, a clean history, no red flags.”
Okay, fair. But then why are we still trusting self-reported income from a 2-week-old PDF? Why are we evaluating someone's reliability based on whether they remembered their old landlord’s phone number?
Let’s look at what actually predicts tenant behavior:
- Real-time bank transaction data
- Rent payment patterns across platforms
- Job change frequency, gathered via LinkedIn or payroll APIs
- Bill payment timing — auto-pay regularity vs. scramble-to-pay
- Even zip code mobility churn rates (how often someone moves)
All of that already exists. It’s not science fiction. It’s just not in your application form.
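To make the point concrete, here's a minimal sketch of how those signals could be blended into a single reliability score. Everything here is an illustrative assumption: the field names, the weights, and the 300-850 scale are invented for the example, not drawn from any real screening model.

```python
# Hypothetical sketch: blending behavioral signals into one tenant-reliability
# score. Weights, field names, and the 300-850 scale are illustrative only.
from dataclasses import dataclass

@dataclass
class TenantSignals:
    on_time_rent_ratio: float    # share of rent payments made on time (0-1)
    autopay_bill_ratio: float    # share of bills on auto-pay vs. scramble-to-pay
    job_changes_per_year: float  # from payroll- or LinkedIn-style APIs
    moves_per_5_years: int       # zip-code mobility churn

def reliability_score(s: TenantSignals) -> int:
    """Weighted blend of behavioral signals, mapped onto a familiar 300-850 range."""
    score = 0.0
    score += 0.5 * s.on_time_rent_ratio                        # strongest signal
    score += 0.2 * s.autopay_bill_ratio
    score += 0.2 * max(0.0, 1.0 - s.job_changes_per_year / 3.0)
    score += 0.1 * max(0.0, 1.0 - s.moves_per_5_years / 5.0)
    return round(300 + 550 * score)

steady = TenantSignals(0.98, 0.9, 0.2, 1)
print(reliability_score(steady))  # → 815
```

The weights are the whole argument in miniature: a model like this rewards observed behavior (paying rent on time, for years) rather than proxies (remembering an old landlord's phone number).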
And here’s the kicker: most big landlords already snoop for this stuff unofficially. Google searches. Social media scans. Peeking at LinkedIn. We’re already doing shadow credit assessments—just poorly, and with no accountability.
So why not build actual systems around this digital exhaust instead of keeping up the paperwork cosplay?
The Real Barrier Isn’t Data. It’s Ego.
The problem isn’t that we lack better tools. It’s that the existing process gives landlords a warm fuzzy feeling of paperwork-as-protection. “Equal treatment” becomes “everyone fills out the same form,” even if the form tells us nothing.
In truth, the form is a shield.
It says: “We did our due diligence. We asked standard questions. We followed the rules.”
But here’s where that logic collapses: human gut-checks based on half-baked forms are far more biased, and yet far more legally defensible, than AI flagging someone based on subtle behavior patterns. If a landlord rejects you because they had a “bad feeling,” that’s hard to challenge. If an AI model does it based on your zip code or spending habits, expect headlines and lawsuits.
So what do landlords prefer? The messy, fuzzy, legally-blind gut check. Because it’s easier to defend in small claims court than a risk model that weighed three layers of behavioral indicators trained on a dataset you don’t fully understand.
It’s not just fear of AI. It’s fear of accountability.
The Tech Already Works. The System Doesn’t.
Look at what’s happened in other sectors:
- Banks underwrite millions in mortgages using machine-learning models that access your income and debt flow directly from your bank via APIs.
- Credit card companies evaluate purchase intent and financial health using behavioral data and clickstream analytics.
- Klarna can approve you for a couch in 0.2 seconds based on affordability signals pulled over APIs.
Meanwhile, you’re attaching a JPG of your auto-insurance bill to prove you’re “responsible.”
Rentals are stuck because no one wants to be first. There’s comfort in collective inertia. The standard-form application has effectively become the floppy disk of tenant screening — everyone knows it’s obsolete, but we’re still building the process around it “just in case.”
Who Stands to Win by Blowing It All Up?
Let’s go there: if someone builds “Plaid-for-renters,” the whole application ritual becomes unnecessary overnight.
- A renter opts into a one-click data stream from their bank, employer, utility providers, and rental history platforms.
- An algorithm synthesizes behavioral risk and trustworthiness across patterns, not proxies.
- The landlord decides minimum criteria — a “renter score” of 720+, no bounced utility payments in the last 8 months, etc.
- Ten applicants apply. One-click approvals for seven of them. Instant verification. No forms. Just outcomes.
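The landlord-side half of that flow is just a threshold check over an already-computed profile. Here's a hypothetical sketch; the field names, thresholds, and the renter profiles are made up for illustration and don't reference any real platform's API.

```python
# Hypothetical sketch of the landlord-side filter: minimum criteria applied
# to pre-verified renter profiles. All names and thresholds are illustrative.
def meets_criteria(profile: dict, min_score: int = 720,
                   clean_utility_months: int = 8) -> bool:
    """One-click screening: renter score plus recent utility-payment history."""
    return (profile["renter_score"] >= min_score
            and profile["months_since_bounced_utility"] >= clean_utility_months)

applicants = [
    {"id": "a1", "renter_score": 755, "months_since_bounced_utility": 12},
    {"id": "a2", "renter_score": 690, "months_since_bounced_utility": 24},
    {"id": "a3", "renter_score": 801, "months_since_bounced_utility": 3},
]
approved = [a["id"] for a in applicants if meets_criteria(a)]
print(approved)  # → ['a1']: the only applicant clearing both thresholds
```

Note what's absent: no form, no uploads, no reference calls. The criteria are explicit and auditable, which is exactly what the gut check isn't.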
It’s not utopian. It already exists in pieces: Esusu for rent reporting. Nova Credit for immigrant credit histories. Plaid for permissioned financial streams. Throw in automated Fair Housing compliance checks, traceability, and transparency tooling, and the pieces are there.
Who builds the interface is still TBD. But make no mistake: whoever gets there first owns the pipeline between eager renters and landlords tired of PDF ping-pong.
But What About Privacy?
Yes. It’s a tradeoff.
Giving up your spending behavior and real-time financial data sounds like something privacy advocates would (rightly) panic about. But let’s be honest: we already give more personal data to TikTok just to filter our FYP. If the tradeoff is frictionless move-ins and fewer rejections, many renters will opt in.
The real issue here isn’t data collection. It’s data asymmetry. Right now, landlords pretend they aren’t already mining your digital footprint while hiding behind vague application forms. What if both parties could see the same data-driven score? Consent. Transparency. Mutual visibility.
Not perfect. But definitely better than waving a rent check and hoping someone calls your boss to confirm your last three pay periods.
The Failure of “We’re Exploring AI”
Let’s close with this: In 2024, “we’re exploring AI” is the corporate version of “we just bought a treadmill.” It sounds good. It signals effort. But mostly it just collects dust in the corner while the old process lumbers on.
The rental application is the perfect case study in this facade. The real story isn’t that we don’t have the tech — it’s that the tech threatens the rituals gatekeepers still rely on.
We’re digitizing an inefficient process because it feels safer than rethinking it.
We’re clinging to the form because we don’t trust ourselves with the future.
We’re asking you to list your last five addresses, even though your phone already knows exactly where you’ve lived and when.
So Where Does That Leave Us?
Here’s a thought to take with you:
- If you're in proptech, ask: are we truly innovating the process, or just slapping “AI” on outdated defaults?
- If you're a landlord, ask: what exactly does this form really tell you? Is it filtering for good tenants or just the ones willing to jump through hoops?
- And if you're a renter: imagine a world where you rent an apartment as easily as you pay for coffee. That world doesn’t require more paperwork. It just requires less fear.
We're not missing the data. We’re missing the courage to let go of the form.
Because deep down, even the gatekeepers know: AI won't kill the application.
Irrelevance will.
This article was sparked by an AI debate. Read the original conversation here.

Lumman
AI Solutions & Ops