Using automations to turn annoying deal blasts into a lucrative database
TL;DR:
I turn annoying widely marketed deal blasts into actionable deal lists with little to no effort.
Every morning I wake up with a new deal loaded into my deal list. A script reads the latest email in a folder, extracts the property information I need, and saves it to a Google Sheets deal list, creating a one-stop place to review all incoming deal flow. No more shuffling through my email.
From there I quickly underwrite the deal. My model is set up to automatically export the key metrics to my deal pipeline and share them with my team via Slack.
The only actions I’m physically involved in are labeling the email and underwriting the deal.
____
Death, Taxes, and Broker marketing blasts flooding your email.
If you are in the real estate industry, you know all about deleting deal marketing emails. If you’re not, you may be wondering why. A few reasons: (1) these deals are seen by everyone on the planet, (2) they are going through the entire sales process, and (3) you’re going to be participating in a highest-bidder auction.
Not the kind of deal you want to spend your precious morning reviewing.
But..
Let’s say you did want to review one of these widely marketed deals.
It will probably take 30 minutes to an hour for the whole process.
You’ll have to find the email. Collect all the files. Store them in one place. Clean up the financial docs so they’re usable. All before even beginning to underwrite. Then you underwrite the deal. Write the results down. Maybe share them with your team or copy the results to a new location.
All of this just to find out the deal was overpriced and the seller is non-negotiable because 124 other people are reviewing it at the same time.
You give up and move on.
A few hours later you’ve completely forgotten about the deal and gained no true value from that time spent.
You do this a handful of times and you will start to auto-delete these emails too.
But..
It doesn’t have to be a complete waste of time. There’s a lot of value in keeping tabs on these deals.
Every deal that trades in your market is another data point. With enough of these data points you can become an expert on real estate in your market. Who knows? Maybe one of these deals actually is a great deal.
With that being said, reviewing a bad deal has to be worth your time. So how can you both focus on high value tasks, and reap the reward of underwriting every deal you see?
You either:
- Hire an analyst for $40-60K a year
- Do it yourself, wasting hours upon hours of your week
- Automate the process
By using a few automations you can transform these annoying emails into a detailed market database. You will:
- Wake up with a new deal every morning
- Underwrite deals 2x faster
- Become an expert in your market & asset class
- Stop fumbling around with files
How it works
This system uses a blend of in-app automations, Zapier integrations, and Python scripts to run a semi-automated underwriting pipeline.
In just a few minutes you can have an underwritten deal and its key metrics saved into a database with minimal effort.
Simply mark these emails with a Gmail label and a script will pull the important info from the email body and move it into a Google spreadsheet. From there you can quickly access the info and begin underwriting. Once you're done underwriting, a single click saves the key results to a database alongside the other deals you have underwritten.
A more detailed breakdown:
1 // Incoming deals
- Deal blast email arrives in your inbox
- You label the email, archiving it to a folder
- Every day at 9:00 AM a bot reads the top email in that folder
- The bot then extracts the important property information, and
- Copies it to a Google Sheets worksheet
- A Slack bot notices the new row and sends a reminder message
- Now you have a single, organized place for incoming deal flow
By clicking once to label an email, I extract and save the important property info to a single spreadsheet.
2 // Underwriting, storing, and updating the team
Now it's time to underwrite these deals.
- I go to the spreadsheet that holds the incoming deals
- Refer to the property info and offering memorandum link that were saved down.
- Rent roll docs are quickly read, cleaned, summarized, and exported to a clean Excel file.
- Complete my brief first-pass underwriting using this model.
- While underwriting, the model automatically saves the key metrics to an “export” tab
- Using Rows, the key metrics from the export tab are automatically moved to my deal pipeline sheet.
- Once the deal pipeline sheet is updated, the key metrics are auto-sent to my team via Slack
When my underwriting is complete, the results are automatically moved to my deal pipeline. Once the deal pipeline sees the new deal, a message with those key results is automatically sent to Slack.
To summarize: I click once on an email to label it. The email is auto-scrubbed and stored. I manually underwrite the deal. When underwriting is complete, the deal results are automatically saved to a new list and shared with my team.
The only physical touch points are the one-click labeling and the underwriting itself.
Setup
Here’s a peek under the hood.
- Figure out which brokerages are selling deals that fit your criteria & sign up for all of their mailing lists
- Create a label in Gmail
- Set up an auto-archive to this label, or manually label emails from these brokerages
- Use a script to parse the property info and pertinent links and save them to Google Sheets
- Schedule the script with a cron job so it runs daily (timing varies depending on how many lists you’re on); see the crontab sketch after this list
- Use Zapier to send a Slack alert when a new deal arrives in the Google Sheet
- Quickly u/w the deal using the rent roll script and u/w model
- Find the export tab in the u/w model and copy the URL
- Use Rows to quickly import, then export, the key results of the model to the deal pipeline sheet.
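For the daily run, a plain cron entry is all you need. A minimal sketch with hypothetical paths (adjust the time and script location to your own setup):

```
# Open your crontab with `crontab -e` and add a line like this
# (runs the parser every day at 9:00 AM; paths are placeholders):
0 9 * * * /usr/bin/python3 /home/you/scripts/parse_deal_blast.py >> /home/you/deal_blast.log 2>&1
```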
Set this up once and every day you will wake up to a new deal in your pipeline. In one click you’ll have access to the financial docs. A rent roll parser immediately exports, organizes, and summarizes the rent roll to a new sheet. This straightforward underwriting model helps give a quick gut check. In one click the key results of the underwriting are exported to a deal pipeline sheet that tracks all underwritten deals.
I keep things very standardized when underwriting these deals. The goal isn’t to write an offer but to build a big-picture view of the market, which will arm you with knowledge and confidence when underwriting better deals. Who knows, you may even come across a standout deal.
Step 1 – Sign up for the email lists of the top brokers in your area & asset class
Figure out who the 10-15 most active brokers are in your area and sign up for their mailing lists. Sometimes you can join on their website; sometimes you will have to ask directly.
Focus on activity – who’s trading the most deals.
A few tips:
- Sign up for the big names because these will bring the volume (these are usually not your best deals, but they’re great for building data)
- Sign up for the niche brokerages: the boutique, single-asset-focused shops (in addition to getting on their lists, these are the brokers you want to build relationships with)
Step 2 – Setting up Gmail
- Create label
- Set rules for the label
Create a label. I call mine Deal Blast. Every time you apply this label to a message, it will be stored in the corresponding folder.
You can even set up an auto-archive (which automatically moves emails to a folder) based on rules you give Gmail (e.g., every email from johnsmith@gmail.com is automatically moved into the folder).
I prefer to manually label emails as I see them, for two reasons: (1) the script looks for the latest email in the folder, so if emails from several brokers are constantly being moved to the Deal Blast folder, the script will miss most of the deal blasts; (2) you will likely have to filter on the sender’s email address to auto-archive properly, and if you have a separate conversation with one of those senders, those emails will land in the folder too and throw off the script.
You can read this guide if you want more details on how to set up your Gmail.
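If you do go the auto-archive route and would rather script the setup than click through Gmail’s settings, here is a rough sketch using the official Gmail API client. This is purely illustrative: the sender address is a placeholder, and it assumes you have already completed Google’s OAuth flow and saved a token.json.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Assumes you've already run Google's OAuth consent flow once and saved token.json.
SCOPES = [
    "https://www.googleapis.com/auth/gmail.labels",
    "https://www.googleapis.com/auth/gmail.settings.basic",
]
creds = Credentials.from_authorized_user_file("token.json", SCOPES)
service = build("gmail", "v1", credentials=creds)

# Create the "Deal Blast" label.
label = service.users().labels().create(
    userId="me", body={"name": "Deal Blast"}
).execute()

# Auto-archive anything from a given broker into that label.
service.users().settings().filters().create(
    userId="me",
    body={
        "criteria": {"from": "listings@examplebrokerage.com"},  # placeholder sender
        "action": {
            "addLabelIds": [label["id"]],
            "removeLabelIds": ["INBOX"],  # skipping the inbox = auto-archive
        },
    },
).execute()
```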
Step 3a – Using the Email Parse Script:
I wrote a script that runs every day at 11 AM. The script reads my inbox, finds a specific folder, grabs the latest email in that folder, reads the body text, extracts specific fields (market, address, units, size, year built, and offering memo link), and sends the extracted text to a specific Google Sheets worksheet and tab.
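Every broker formats their blasts differently, so treat the following as a minimal sketch of the approach rather than the exact script: it uses IMAP plus gspread, and the label name, field patterns, and sheet names are all placeholders you would tune to the emails you actually receive.

```python
import email
import imaplib
import re

import gspread

# Placeholders - swap in your own account, app password, and label.
GMAIL_USER = "you@example.com"
GMAIL_APP_PASSWORD = "app-specific-password"
LABEL = "Deal Blast"

# These patterns assume the blast body contains lines like "Units: 120".
# Adjust them to whatever your brokers actually send.
FIELDS = {
    "market": re.compile(r"Market:\s*(.+)", re.I),
    "address": re.compile(r"Address:\s*(.+)", re.I),
    "units": re.compile(r"Units:\s*([\d,]+)", re.I),
    "size_sf": re.compile(r"(?:Size|SF):\s*([\d,]+)", re.I),
    "year_built": re.compile(r"Year Built:\s*(\d{4})", re.I),
    "om_link": re.compile(r"(https?://\S+)", re.I),
}


def latest_email_body() -> str:
    """Return the plain-text body of the newest email under the label."""
    mail = imaplib.IMAP4_SSL("imap.gmail.com")
    mail.login(GMAIL_USER, GMAIL_APP_PASSWORD)
    mail.select(f'"{LABEL}"')  # Gmail labels show up as IMAP mailboxes
    _, data = mail.search(None, "ALL")
    newest_id = data[0].split()[-1]
    _, msg_data = mail.fetch(newest_id, "(RFC822)")
    msg = email.message_from_bytes(msg_data[0][1])
    for part in msg.walk():
        if part.get_content_type() == "text/plain":
            return part.get_payload(decode=True).decode(errors="ignore")
    return ""


def main() -> None:
    body = latest_email_body()
    row = []
    for pattern in FIELDS.values():
        match = pattern.search(body)
        row.append(match.group(1).strip() if match else "")

    # Append the extracted fields to the incoming-deals tab.
    client = gspread.service_account(filename="service_account.json")
    worksheet = client.open("Deal Pipeline").worksheet("Incoming Deals")
    worksheet.append_row(row)


if __name__ == "__main__":
    main()
```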
Every morning I wake up to a new deal in my deal pipeline worksheet that looks like this.
I simply click the link, download the financials and begin underwriting.
Step 3b – Slack Reminder:
If I’m busy and forget to check the spreadsheet, no problem: any time a new deal arrives in the sheet, I receive a Slack message reminding me it’s there. This is done with a simple Zapier integration. The message looks like this:
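If you ever want to cut Zapier out of the chain, the same reminder can be posted straight from the parse script using a Slack incoming webhook. A rough sketch, with a placeholder webhook URL:

```python
import requests

# Placeholder URL - create one under Slack's "Incoming Webhooks" app settings.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"


def notify_slack(address: str, units: str, om_link: str) -> None:
    """Post a short reminder that a new deal landed in the sheet."""
    text = f"New deal blast saved: {address} | {units} units | OM: {om_link}"
    requests.post(SLACK_WEBHOOK_URL, json={"text": text}, timeout=10)
```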
Step 4 – Rent Roll Script:
A bot finds the most recently downloaded Excel or PDF file, then cleans, summarizes, and exports the rent roll into a clean Excel file.
One of the most annoying things about underwriting deals is dealing with financial docs. If you have ever underwritten a deal, you have received or downloaded an unusable PDF or an over-formatted Excel file. You’re stuck either manually re-entering the data, reformatting it, or using a third-party app like Adobe to export the PDF into Excel, which almost always comes out looking like complete chaos.
For this reason I created a rent roll script that finds the latest rent roll file on my computer, whether it’s a PDF or an Excel file, and quickly reads, cleans, summarizes, and exports it into a clean Excel file.
In just one click, the rent roll goes from:
a raw PDF file with unneeded info and no summary
TO
a clean Excel file with unit counts, average unit rents, and unit sizes by unit type.
This decreases my time spent underwriting tremendously!
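Under the hood this is mostly file plumbing plus a group-by. Here is a simplified sketch of the Excel path using pandas; the column names are assumptions (real rent rolls rarely arrive this clean), and the PDF path would need a table-extraction library such as pdfplumber layered on top.

```python
from pathlib import Path

import pandas as pd

DOWNLOADS = Path.home() / "Downloads"


def latest_rent_roll() -> Path:
    """Grab the most recently downloaded Excel rent roll."""
    files = list(DOWNLOADS.glob("*.xlsx")) + list(DOWNLOADS.glob("*.xls"))
    return max(files, key=lambda f: f.stat().st_mtime)


def summarize(path: Path) -> None:
    # Assumes the rent roll has "Unit Type", "Market Rent", and "SF" columns;
    # rename these to match whatever the broker's file actually uses.
    rent_roll = pd.read_excel(path)
    summary = (
        rent_roll.groupby("Unit Type")
        .agg(
            units=("Unit Type", "size"),
            avg_rent=("Market Rent", "mean"),
            avg_sf=("SF", "mean"),
        )
        .reset_index()
    )
    # Write both the cleaned rent roll and the unit-type summary to a new workbook.
    out = path.with_name(path.stem + "_summary.xlsx")
    with pd.ExcelWriter(out) as writer:
        rent_roll.to_excel(writer, sheet_name="Rent Roll", index=False)
        summary.to_excel(writer, sheet_name="Summary", index=False)


if __name__ == "__main__":
    summarize(latest_rent_roll())
```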
Step 5 – Underwriting & Rows:
Automatically build a database while you underwrite.
I built a model that is very straightforward and perfect for initial deal gut checks. You can access the model template here. There’s a tab in the model named “Export”; as you underwrite, all of the key deal metrics and descriptors auto-populate to this tab.
I use a free no-code tool called Rows to pull those key metrics from Google Sheets into my deal pipeline in one click.
Ultimately, the goal of this whole process is to make underwriting widely marketed deals worthwhile. We do that by saving all of this data so we can slowly build a database while we underwrite. The loop from my model to Rows to the deal pipeline lets me do this effortlessly.
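Rows handles this hop without code, but for anyone who prefers to script it, the same move is a short read-and-append with gspread. A minimal sketch with placeholder sheet and tab names:

```python
import gspread

client = gspread.service_account(filename="service_account.json")

# Placeholder names - point these at your underwriting model and pipeline sheet.
export_tab = client.open("UW Model - 123 Main St").worksheet("Export")
pipeline_tab = client.open("Deal Pipeline").worksheet("Underwritten Deals")

# Assumes the Export tab keeps the key metrics on row 2, under a header row.
metrics = export_tab.row_values(2)
pipeline_tab.append_row(metrics)
```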
Step 6 - Deal Pipeline & Market Research:
The deal pipeline is my number-one database when helping or working with GPs and brokers. It holds all deal information, whether it’s a deal being tracked, a deal that needs to be underwritten, a deal that has been underwritten, or a deal with an offer out.
In this process the deal pipeline contains all the deals that have been underwritten. The key results from the model above get dumped into this list. Over many deals this becomes a fantastic overview of the market.
A pivot table on the first tab summarizes the information, giving the user a high-level snapshot of valuations and return metrics for these deals.
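The pivot lives in the Google Sheet itself, but the same roll-up is easy to reproduce if you ever export the pipeline to a file. A quick sketch with assumed column names:

```python
import pandas as pd

# Assumes you've exported the deal pipeline sheet to CSV with these columns.
deals = pd.read_csv("deal_pipeline.csv")

snapshot = pd.pivot_table(
    deals,
    index="Submarket",
    values=["Price / Unit", "Cap Rate", "5-Yr IRR"],
    aggfunc="mean",
)
print(snapshot.round(2))
```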
I go into depth and share the template here [LINK].
Idea for brokers
This system could power a killer newsletter for brokers. Imagine taking all of the listed deals in your market and, with minimal effort, underwriting and summarizing them, then sharing the results with an email list. An effortless way to show your market expertise.
Final thoughts
The purpose is not to automate the entire underwriting process. The idea is to turn a “not worthy of my time” task into a lucrative database in a relatively quick and easy manner.
If you want to learn more about the automations or need assistance with your processes, feel free to reach out.