In today's competitive market, businesses seek new ways to gain an edge over their rivals. One of the most effective ways is by analyzing data and extracting valuable insights. Restaurant owners, for instance, can use data on customer preferences and popular menu items to improve their offerings and attract more customers.
Postmates.com is a popular platform that allows users to order food from various restaurants. With so many restaurants listed, the site is a rich source of menu data. This article will explore how to scrape restaurant menus from Postmates.com using Python and a few libraries. After reading this guide, you will have the knowledge and tools to get the most value out of Postmates restaurant data.
Web scraping is the process of extracting data from online sources using code. In its basic form, web scraping consists of HTML requests and responses. Web scrapers can collect many data types from a given source, including simple lists of items or more complex data tables. The scraped data can then be analyzed to determine significant trends and insights to help business owners make more informed decisions regarding their operations.
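The request/parse cycle described above can be sketched in a few lines. The HTML below is an illustrative sample rather than real Postmates markup, and the class names are assumptions for the sake of the example:

```python
# Minimal sketch of the parse half of the request/response cycle,
# using a sample HTML fragment instead of a live page.
from bs4 import BeautifulSoup

sample_html = """
<ul class="menu">
  <li class="item"><span class="name">Steak Frites</span><span class="price">$18.00</span></li>
  <li class="item"><span class="name">Fries</span><span class="price">$4.50</span></li>
</ul>
"""

def extract_menu(html):
    """Parse menu item names and prices from an HTML fragment."""
    soup = BeautifulSoup(html, "html.parser")
    items = []
    for li in soup.select("li.item"):
        name = li.select_one(".name").get_text(strip=True)
        price = float(li.select_one(".price").get_text(strip=True).lstrip("$"))
        items.append({"name": name, "price": price})
    return items

print(extract_menu(sample_html))
```

In a real scraper, the `sample_html` string would come from an HTTP response; the parsing logic stays the same.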
Postmates is an on-demand delivery platform that allows users to order food and other goods from restaurants in their area via an app or website, with delivery handled by local couriers on bikes, on foot, or in cars. Postmates is popular with users and has an impressive number of restaurants on its platform.
Before writing any code, we need to identify the information we want so we can design a scraper that extracts exactly the necessary data. The sections below describe valuable data points business owners can use to improve their operations.
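One way to pin down those data points is to define a record for each scraped menu item up front. The field names below are illustrative assumptions, not a real Postmates schema:

```python
# A hypothetical record for one scraped menu item, covering the data
# points discussed below (all field names are illustrative).
from dataclasses import dataclass, field

@dataclass
class MenuItem:
    restaurant: str
    name: str
    price: float
    limited_time_offer: bool = False          # flagged promotions
    delivery_modes: list = field(default_factory=list)  # e.g. ["car", "bicycle"]

item = MenuItem("Sample Bistro", "Steak Frites", 18.0,
                delivery_modes=["car", "bicycle"])
print(item.price)
```

Having a fixed record like this makes the later analysis (price comparisons, offer tracking, delivery coverage) much simpler.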
Pricing data helps determine whether a restaurant has room to gain an edge by lowering prices. If dish prices suggest a menu is overpriced, customers may not return to that restaurant and may stop using the platform altogether. A significant decline in customers means less revenue, and less revenue means a reduced ability to expand your business.
The quality of dishes on Postmates varies widely, from premium items like steak to inexpensive ones like fries (see screenshot). Knowing where the quality bar sits makes it easier for businesses of all sizes and types to compete effectively by improving their own offerings.
Some food products sell better than others, and some dishes appear on the menu only briefly, as flagged by the "Limited Time Offer" label. Certain menu keywords can be an excellent opportunity to attract more customers, while others are best avoided.
Postmates is known for offering convenient delivery options for customers, including delivery by bicycle and on foot, in addition to standard car delivery. Businesses can use this data to identify opportunities to expand their reach and attract new customers.
After you know what data you want to extract, you can use Python and a few web scraping libraries to collect it from a web page. In the following tutorial, we'll use Scrapy and Beautiful Soup. Both are available through the Anaconda package manager, an easy-to-use application that simplifies installing Python packages. If you don't have Anaconda installed, follow these instructions to install it.
To start, open a terminal window and create a new directory to store your Python code. Then use pip to install scrapy and beautifulsoup4:

$ mkdir postmates_scraper
$ cd postmates_scraper
$ pip install scrapy beautifulsoup4

Next, create an empty Python module called scrape_postmates.py where your code will be stored. We'll add our code in a later step.
Our objective is to scrape the restaurant menus from Postmates, so let's create a Scrapy project on our machine:

$ scrapy startproject postmates_scraper
$ cd postmates_scraper
The output confirms that the project was successfully created. No data has been scraped yet; that comes later.
Before we start scraping, we need to set up our credentials so the scrapy library can connect to Postmates. We'll store them in a local configuration file and as environment variables, keeping them out of the scraping code itself.
Create a file called postmates_creds.py in the same directory where scrape_postmates.py is stored and add the following placeholder values (replace them with your own, and keep this file out of version control):

username = 'postmates_username'
password = 'postmates_password'
APIkey = 'postmates_apiKey'
Sign in to your Postmates account, click "Account Settings," and select "API Keys." You'll then see a screen with your API credentials:
You'll need to go to the "Keys" section to see the key for each environment (Production, Staging, QA). The APIkey variable in your code should be set to the <API Key> field value.
Once you have all your credentials, create a new file called env.txt in the same directory as scrape_postmates.py and add the following code:

export POSTMATES_USERNAME=<username>
export POSTMATES_PASSWORD=<password>
export POSTMATES_TOKEN=<API key>
Alternatively, on a Mac or Linux machine, you can add these variables to your .bashrc file. Once these environment variables are set, our Python code can read them to authenticate with Postmates.
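Reading those variables back in Python is straightforward with the standard library. The variable names match the env.txt file above; the fallback values are just empty strings so missing variables are easy to detect:

```python
# Read the Postmates credentials from the environment.
# os.environ.get returns "" here if a variable has not been exported.
import os

username = os.environ.get("POSTMATES_USERNAME", "")
password = os.environ.get("POSTMATES_PASSWORD", "")
api_key = os.environ.get("POSTMATES_TOKEN", "")

if not api_key:
    print("POSTMATES_TOKEN is not set; export it before scraping.")
```

Using `os.environ.get` with a default avoids a `KeyError` when a variable is missing, which makes the failure mode a clear message instead of a crash.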
We need to import the configuration into our project, so create a new file called urls_scrape.py (in the same directory as scrape_postmates.py) and add the following code:

from postmates_creds import username, password, APIkey
Next, run the code in a terminal with python urls_scrape.py. You should get output similar to the following, showing the restaurants and their availability: Scraped 0 restaurants, 0 available.
In your project, you can use this variable to check how many restaurants you have scraped and their available delivery times:
> scrapy crawl postmates_scraper -o restaurant_times.json
You should see this output: Scraped 0 restaurants, 0 available.
To extract data from Postmates pages, we can use the BeautifulSoup module in combination with the lxml HTML parser. With these two tools, we can easily parse a web page's HTML structure and read its elements and attributes.
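The snippet below shows the combination in miniature: BeautifulSoup parsing a fragment and reading both text and an attribute. The fragment and its class names are illustrative, not real Postmates markup (the stricter `lxml` parser can be swapped in if it is installed):

```python
# Parse an illustrative HTML fragment and read text plus an attribute.
from bs4 import BeautifulSoup

fragment = '<div class="restaurant" data-id="42"><h2>Sample Bistro</h2></div>'

# "html.parser" ships with Python; pass "lxml" instead for the faster parser.
soup = BeautifulSoup(fragment, "html.parser")
div = soup.find("div", class_="restaurant")

print(div["data-id"])        # attribute access works like a dict
print(div.h2.get_text())     # nested tags are reachable as attributes
```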
Create another empty file called parse_postmates.py (in the same directory as scrape_postmates.py) and add the following imports:

import lxml.html
from bs4 import BeautifulSoup
This is where we use BeautifulSoup to get the page's structure and extract data from it. Let's start with saving money on delivery charges:
>>> import parse_postmates
We'll identify menu keywords that attract more customers, and others that are risky, giving us a picture of our competitors' success.
The code in this tutorial targets all restaurants that have "delivery" in their name.
This function returns a dictionary containing all the restaurant data:

>>> restaurant_times = restaurant_times.get('delivery')
>>> print(restaurant_times)
{'serves_alcohol': False, 'delivery_charges': 0.0, 'phone': '', 'checkout_available': True, 'wheelchair_accessible': False, 'cash_only': False, ... }
We can expand this function and get the related restaurant information if we have more than ten restaurants with "delivery" in their name. Now, let's use BeautifulSoup to get the restaurant information:
>>> delivery_key = next(k for k in restaurant_times if 'delivery' in k)
>>> print(restaurant_times[delivery_key])
{'serves_alcohol': False, 'phone': '', 'checkout_available': True, 'wheelchair_accessible': False}
We can scrape other attributes, such as the address, phone number, and all the necessary information.
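Pulling individual attributes out of one of those records is ordinary dictionary access. The record below mirrors the sample output shown earlier; using `dict.get` guards against keys that a particular restaurant happens to lack:

```python
# Pull individual attributes from a scraped restaurant record.
# The record mirrors the sample output shown earlier in the tutorial.
record = {
    "serves_alcohol": False,
    "delivery_charges": 0.0,
    "phone": "",
    "checkout_available": True,
    "wheelchair_accessible": False,
    "cash_only": False,
}

# dict.get avoids KeyError for missing fields; "or" supplies a fallback
# when the field is present but empty.
phone = record.get("phone") or "unknown"
fee = record.get("delivery_charges", 0.0)
print(phone, fee)
```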
You have successfully created and run your web scraper program. You learned how to set up your Postmates username, password, and API key. You also learned how to scrape Postmates' restaurant data. The next part of this series will explore how we can create a restaurant crawler program that scrapes other websites for restaurant information.