Scrape Google Shopping Results
In a fiercely competitive market, web scraping is a valuable marketing and monitoring strategy for keeping tabs on your competitors. Extracting publicly available data not only gives you competitive leverage but also lets you make informed strategic decisions to expand your foothold in the market.
In this tutorial, we will be scraping Google Shopping Results using Node JS. We will also explore the benefits and solutions to the problems that might occur while gathering data from Google Shopping.

Why Scrape Google Shopping?
Scraping Google Shopping gives you the following benefits:

Price Monitoring – Track the price of a product across multiple sellers and compare them to find the cheapest source, which also saves you money.
Product Information – Get detailed information about products from Google Shopping, and compare their reviews and features against other products to find the best one.
Product Availability – Monitor the availability of a set of products automatically instead of manually checking each seller, which saves you valuable time.
Let’s start scraping Google Shopping:
In this section, we will scrape Google Shopping Results. But first, let us set up the requirements for this project.
Web Parsing with CSS selectors
Hunting through raw HTML for the right tags is both difficult and time-consuming. It is easier to use the CSS Selector Gadget to pick accurate tags and make your web scraping journey smoother.
This gadget helps you build the right CSS selector for your needs. Here is the link to the tutorial, which will teach you how to use this gadget to select the best CSS selectors for your use case.
Install Libraries
To scrape Google Shopping Results, we first need to install some NPM libraries.
Before starting, make sure you have a Node JS project set up and both packages installed — Unirest JS and Cheerio JS. You can install both packages with `npm i unirest cheerio`.
Process

We have installed everything our scraper needs. Now we will request our target URL with Unirest JS to get the HTML, and then parse the extracted HTML with the help of Cheerio JS.
We will target this URL:
`https://www.google.com/search?q=nike+shoes&tbm=shop&gl=us`
Look at the `tbm` parameter and its value (`shop`, here). This value tells Google that we are looking for shopping results.
Open this URL in your browser and inspect the code. You will see that every organic shopping result is inside the tag `.sh-dgr__gr-auto`.
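The same URL can be assembled with Node's built-in `URLSearchParams`, which also takes care of encoding the query string. This is just a small sketch, separate from the scraper itself:

```javascript
// Build the Google Shopping search URL programmatically.
// The parameter names (q, tbm, gl) come from the URL above.
const params = new URLSearchParams({
  q: "nike shoes", // the search query; spaces are encoded automatically
  tbm: "shop",     // tells Google we want Shopping results
  gl: "us",        // country code for the results
});
const url = `https://www.google.com/search?${params.toString()}`;
console.log(url);
// https://www.google.com/search?q=nike+shoes&tbm=shop&gl=us
```

Changing `q` or `gl` here is all it takes to point the scraper at a different query or country.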

Now, we will search for the tags for the title, product link, price, rating, reviews, delivery, and source.


We have completed our search for tags of organic shopping results. Now, we will search for the tags of ad results.

If you inspect the ad results, you will see that they are all inside the tag `.sh-np__click-target`. This tag contains all the information about the title, link, price, and source.
2023 Update: Google updates these tags from time to time, and some have changed. The code below uses the latest tags.
All the above things make our code look like this:
const unirest = require("unirest");
const cheerio = require("cheerio");

const getShoppingData = () => {
  try {
    return unirest
      .get("https://www.google.com/search?q=nike+shoes&tbm=shop&gl=us")
      .headers({
        "User-Agent":
          "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.54 Safari/537.36",
      })
      .then((response) => {
        const $ = cheerio.load(response.body);

        // Ad results
        let ads = [];
        $(".sh-np__click-target").each((i, el) => {
          ads.push({
            title: $(el).find(".sh-np__product-title").text(),
            link: "https://google.com" + $(el).attr("href"),
            source: $(el).find(".sh-np__seller-container").text(),
            price: $(el).find(".hn9kf").text(),
            delivery: $(el).find(".U6puSd").text(),
          });
          if ($(el).find(".rz2LD").length) {
            ads[i].extensions = $(el).find(".rz2LD").text();
          }
        });
        // Drop keys with empty values
        for (let i = 0; i < ads.length; i++) {
          Object.keys(ads[i]).forEach((key) =>
            ads[i][key] === "" ? delete ads[i][key] : {}
          );
        }

        // Organic shopping results
        let shopping_results = [];
        $(".sh-dgr__gr-auto").each((i, el) => {
          shopping_results.push({
            title: $(el).find("h3.tAxDx").text(),
            // The raw href wraps the product URL; keep everything after the first "="
            link: $(el)
              .find(".zLPF4b .eaGTj a.shntl")
              .attr("href")
              .substring($(el).find("a.shntl").attr("href").indexOf("=") + 1),
            // Strip the inline CSS Google embeds inside the source text
            source: $(el)
              .find(".IuHnof")
              .text()
              .replace(/\.aULzUe\{.*?\}\.aULzUe::after\{.*?\}/, ""),
            price: $(el).find(".XrAfOe .a8Pemb").text(),
            // The rating text looks like "4.5 out of 5 stars. 1,234 reviews"
            rating: $(el).find(".NzUzee .QIrs8").text()
              ? parseFloat(
                  $(el).find(".NzUzee .QIrs8").text()?.split("out")[0]?.trim()
                )
              : "",
            reviews: $(el).find(".NzUzee .QIrs8").text()
              ? parseFloat(
                  $(el)
                    .find(".NzUzee .QIrs8")
                    .text()
                    ?.split("stars.")[1]
                    ?.trim()
                    ?.replace(/,/g, "")
                )
              : "",
            delivery: $(el).find(".vEjMR").text(),
          });
          if ($(el).find(".Ib8pOd").length) {
            shopping_results[i].extensions = $(el).find(".Ib8pOd").text();
          }
        });
        // Drop keys with empty values
        for (let i = 0; i < shopping_results.length; i++) {
          Object.keys(shopping_results[i]).forEach((key) =>
            shopping_results[i][key] === ""
              ? delete shopping_results[i][key]
              : {}
          );
        }

        console.log(ads);
        console.log(shopping_results);
      })
      .catch((e) => console.log(e)); // catch rejected requests too
  } catch (e) {
    console.log(e);
  }
};
getShoppingData();
Run this code in your terminal, and the scraped ad and organic shopping results will be printed as arrays of objects.
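The two loops that delete empty-string fields can be factored into a small helper. This is just a refactoring sketch of the cleanup step in the code above:

```javascript
// Remove keys whose value is an empty string, mutating each object in place.
// Mirrors the Object.keys(...).forEach(...) cleanup loops in the scraper.
const dropEmptyFields = (items) => {
  for (const item of items) {
    for (const key of Object.keys(item)) {
      if (item[key] === "") delete item[key];
    }
  }
  return items;
};

const cleaned = dropEmptyFields([{ title: "Nike Air", rating: "" }]);
console.log(cleaned); // [ { title: 'Nike Air' } ]
```

With this helper, both cleanup loops collapse into `dropEmptyFields(ads)` and `dropEmptyFields(shopping_results)`.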
Save the data in a CSV file
Instead of leaving the data as a mess, we should save the extracted information in a CSV file. We will use the npm library objects-to-csv to complete this task.
Let us install it.
npm i objects-to-csv
Then import it into your code.
const ObjectsToCsv = require('objects-to-csv');
Then, we are going to use this library to store the scraped Google Shopping Data in the CSV file.
const csv = new ObjectsToCsv(shopping_results)
csv.toDisk('./shopping_data.csv', { append: true })
To ensure that fresh data is appended to the existing file, we have passed `append: true` as an option to the `toDisk()` method.
After running this code in your terminal, you will find a file named `shopping_data.csv` in your root project folder.

With Google Shopping API
If you don’t want to code and maintain the scraper in the long run, then you can try our Google Shopping API to scrape shopping results.

We also offer 100 free requests on the first sign-up.
After registering successfully on Serpdog, embed your API key in the code below, and you will be able to scrape Google Shopping Results without any blockage.
const axios = require("axios");

axios
  .get("https://api.serpdog.io/shopping?api_key=APIKEY&q=shoes&gl=us")
  .then((response) => {
    console.log(response.data);
  })
  .catch((error) => {
    console.log(error);
  });
Conclusion:
In this tutorial, we learned to scrape Google Shopping Results using Node JS. Feel free to message me anything you need clarification on. Follow me on Twitter. Thanks for reading!
Additional Resources
- How to scrape Google Organic Search Results using Node JS?
- Scrape Google Images Results
- Scrape Google News Results
- Scrape Google Maps Reviews
Frequently Asked Questions
How Do I Get Google Shopping Results?
You can get Google Shopping results by using the Serpdog Google Shopping API, without any problems with proxies and CAPTCHAs. This data is a great source for data miners doing competitor price tracking, sentiment analysis, and more.