Understanding competition is critical for:
Google Maps is one of the most accurate real-time sources for competitor intelligence because it provides:
This tutorial shows how Actowiz Solutions builds Competitor Insight Engines for restaurants and grocery stores using:
All examples use Python, Selenium, Requests, TextBlob, and Lat-Lng grid crawling.
Install dependencies:
pip install selenium
pip install undetected-chromedriver
pip install pandas
pip install textblob
pip install requests
pip install lxml
pip install beautifulsoup4
Imports:
import undetected_chromedriver as uc
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from time import sleep
import pandas as pd
from textblob import TextBlob
import re
This pipeline captures:
Example output:
{
"name": "Fresh Market",
"category": "Grocery Store",
"rating": 4.4,
"reviews": 815,
"address": "Brooklyn, NY, USA",
"lat": 40.6231,
"lng": -73.9752,
"services": ["Pickup", "Delivery"],
"sentiment": 0.63
}
Popular competitor categories:
Example:
search = "grocery stores in Manhattan"
url = f"https://www.google.com/maps/search/{search.replace(' ', '+')}"
browser = uc.Chrome()
browser.get(url)
sleep(4)
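A plain space-to-plus replace works for this query, but breaks on searches containing `&`, `#`, or non-ASCII characters. `urllib.parse.quote_plus` handles the full encoding; a minimal sketch (the helper name is ours):

```python
from urllib.parse import quote_plus

def build_search_url(query):
    # quote_plus encodes spaces as "+" and escapes &, #, and non-ASCII characters
    return "https://www.google.com/maps/search/" + quote_plus(query)

print(build_search_url("grocery stores in Manhattan"))
# https://www.google.com/maps/search/grocery+stores+in+Manhattan
```
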
All results load inside a scrollable container.
panel = browser.find_element(By.CLASS_NAME, "m6QErb")
for _ in range(120):
    browser.execute_script("arguments[0].scrollTop = arguments[0].scrollHeight", panel)
    sleep(1)
cards = browser.find_elements(By.XPATH, '//div[contains(@class,"Nv2PK")]')
competitors = []
for c in cards:
    try:
        name = c.find_element(By.CLASS_NAME, "qBF1Pd").text
    except Exception:
        name = ""
    try:
        rating = c.find_element(By.CLASS_NAME, "MW4etd").text
    except Exception:
        rating = ""
    try:
        reviews = c.find_element(By.CLASS_NAME, "UY7F9").text
    except Exception:
        reviews = ""
    try:
        category = c.find_element(By.CLASS_NAME, "W4Efsd").text
    except Exception:
        category = ""
    try:
        address = c.find_element(By.CLASS_NAME, "rllt__details").text
    except Exception:
        address = ""
    try:
        link = c.find_element(By.TAG_NAME, "a").get_attribute("href")
    except Exception:
        link = ""
    competitors.append({
        "name": name,
        "rating_raw": rating,
        "reviews_raw": reviews,
        "category": category,
        "address": address,
        "maps_url": link
    })
Google Maps URLs contain coordinate metadata.
def extract_lat_lng(url):
    try:
        lat, lng = re.findall(r"@([-0-9\.]+),([-0-9\.]+)", url)[0]
        return float(lat), float(lng)
    except (IndexError, ValueError):
        return None, None

def extract_place_id(url):
    try:
        return re.findall(r"!1s([^!]+)!8m", url)[0]
    except IndexError:
        return None
for comp in competitors:
    comp["lat"], comp["lng"] = extract_lat_lng(comp["maps_url"])
    comp["place_id"] = extract_place_id(comp["maps_url"])
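The two patterns can be exercised on a synthetic URL in the usual Maps shape (the URL below is illustrative, not a real listing) to confirm they pull out what we expect:

```python
import re

def extract_lat_lng(url):
    try:
        lat, lng = re.findall(r"@([-0-9\.]+),([-0-9\.]+)", url)[0]
        return float(lat), float(lng)
    except (IndexError, ValueError):
        return None, None

def extract_place_id(url):
    m = re.findall(r"!1s([^!]+)!8m", url)
    return m[0] if m else None

# Synthetic URL: the @lat,lng segment and the !1s...!8m segment carry the metadata
sample = "https://www.google.com/maps/place/Fresh+Market/@40.6231,-73.9752,15z/data=!1s0x89c2a1:0xabc!8m2"
print(extract_lat_lng(sample))   # (40.6231, -73.9752)
print(extract_place_id(sample))  # 0x89c2a1:0xabc
```

URLs without the `@` segment simply yield `(None, None)`, so downstream code should expect missing coordinates.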
def clean_rating(r):
    nums = re.findall(r"\d+\.?\d*", r)
    return float(nums[0]) if nums else None

def clean_reviews(r):
    nums = re.findall(r"\d+", r.replace(",", ""))
    return int(nums[0]) if nums else None
for comp in competitors:
    comp["rating"] = clean_rating(comp["rating_raw"])
    comp["reviews"] = clean_reviews(comp["reviews_raw"])
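A quick sanity check of the cleaners against the string shapes Maps typically renders ("4.4", "(815)", "(1,203)", and the empty string from a failed lookup):

```python
import re

def clean_rating(r):
    nums = re.findall(r"\d+\.?\d*", r)
    return float(nums[0]) if nums else None

def clean_reviews(r):
    # Strip thousands separators before extracting digits
    nums = re.findall(r"\d+", r.replace(",", ""))
    return int(nums[0]) if nums else None

print(clean_rating("4.4"), clean_reviews("(815)"), clean_reviews("(1,203)"), clean_rating(""))
# 4.4 815 1203 None
```
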
Open each competitor's page, then click into the reviews tab.
sentiments = []
for comp in competitors[:25]:
    browser.get(comp["maps_url"])
    sleep(3)
    try:
        btn = browser.find_element(By.XPATH, '//button[contains(@aria-label,"reviews")]')
        btn.click()
        sleep(2)
    except Exception:
        continue
    panel = browser.find_element(By.CLASS_NAME, "m6QErb")
    for _ in range(30):
        browser.execute_script("arguments[0].scrollTop = arguments[0].scrollHeight", panel)
        sleep(1)
    cards = browser.find_elements(By.XPATH, '//div[@data-review-id]')
    for r in cards:
        try:
            text = r.find_element(By.CLASS_NAME, "wiI7pd").text
        except Exception:
            text = ""
        try:
            rating = r.find_element(By.CLASS_NAME, "fzvQZe").get_attribute("aria-label")
        except Exception:
            rating = ""
        sentiments.append({
            "competitor": comp["name"],
            "text": text,
            "rating": rating
        })
df_sent = pd.DataFrame(sentiments)

def polarity(t):
    return TextBlob(t).sentiment.polarity

df_sent["sentiment"] = df_sent["text"].apply(polarity)
Output sample:
| competitor | sentiment | text |
|---|---|---|
| Fresh Market | 0.62 | Good vegetables and friendly staff |
| Domino’s | -0.35 | Delivery was late |
score = df_sent.groupby("competitor")["sentiment"].mean()
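The per-competitor mean can then be merged back into the main competitor table. A sketch on toy rows (names and values illustrative; competitors with no scraped reviews come through as NaN):

```python
import pandas as pd

df_sent = pd.DataFrame({
    "competitor": ["Fresh Market", "Fresh Market", "Domino's"],
    "sentiment": [0.7, 0.5, -0.35],
})
score = df_sent.groupby("competitor")["sentiment"].mean()

df_comp = pd.DataFrame({"name": ["Fresh Market", "Domino's"]})
# Left-join the mean sentiment onto the competitor list by name
df_comp = df_comp.merge(score.rename("sentiment"),
                        left_on="name", right_index=True, how="left")
print(df_comp)
```
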
df_comp = pd.DataFrame(competitors)
df_comp["strength_score"] = (
    df_comp["rating"] * 0.6 +
    (df_comp["reviews"] / df_comp["reviews"].max()) * 0.4
)
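Note that the two terms sit on different scales: rating runs 0-5 while the review term is normalized to 0-1, so rating dominates the blend. A toy run (values illustrative) makes the behavior concrete:

```python
import pandas as pd

df = pd.DataFrame({"name": ["A", "B"], "rating": [4.4, 3.9], "reviews": [815, 100]})
# Same formula as above: 60% weight on raw rating, 40% on review volume share
df["strength_score"] = df["rating"] * 0.6 + (df["reviews"] / df["reviews"].max()) * 0.4
print(df)
# A scores 4.4*0.6 + 1.0*0.4 = 3.04; B trails on both terms
```

Dividing rating by 5 before weighting would put both terms on a 0-1 scale if a balanced blend is preferred.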
Export coordinates:
df_geo = df_comp[["name", "lat", "lng", "rating", "strength_score"]]
df_geo.to_csv("competitor_locations.csv", index=False)
Load into:
This generates:
Use latitude/longitude grid to visualize:
Example grid analysis (pseudocode):
df_geo["grid_lat"] = df_geo["lat"].round(2)
df_geo["grid_lng"] = df_geo["lng"].round(2)
heat = df_geo.groupby(["grid_lat", "grid_lng"])["name"].count()
gaps = heat.reset_index().sort_values("name").head(50)  # sparsest grid cells first
The sparsest grid cells, those with the fewest competitors, are the strongest candidates for opening a new store.
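Running the same grouping on a handful of toy coordinates (illustrative values) shows how sparse cells surface first:

```python
import pandas as pd

df_geo = pd.DataFrame({
    "name": ["A", "B", "C", "D"],
    "lat": [40.621, 40.622, 40.623, 40.701],
    "lng": [-73.971, -73.972, -73.974, -73.951],
})
# Rounding to 2 decimals buckets points into roughly 1 km grid cells
df_geo["grid_lat"] = df_geo["lat"].round(2)
df_geo["grid_lng"] = df_geo["lng"].round(2)
heat = df_geo.groupby(["grid_lat", "grid_lng"])["name"].count()
gaps = heat.reset_index().sort_values("name")
print(gaps)  # the cell holding only store D sorts to the top
```
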
Use this tutorial if:
Use Actowiz Solutions when:
Actowiz provides:
In this guide, you learned how to:
This is the foundation for hyperlocal expansion planning, market-entry strategy, and store network optimization powered by Actowiz Solutions.
You can also reach us for all your mobile app scraping, data collection, web scraping, and instant data scraper service requirements!
Our web scraping expertise is relied on by 4,000+ global enterprises including Zomato, Tata Consumer, Subway, and Expedia — helping them turn web data into growth.