Introduction
Web scraping is a powerful technique that allows users to extract data from websites automatically. While Python is commonly used for web scraping, Excel VBA combined with Selenium can also be an effective solution for automating data collection directly into Excel.
This guide will cover:
✔️ Setting up Selenium with Excel VBA
✔️ Writing VBA code to extract data from a website
✔️ Automating web navigation (e.g., clicking buttons, searching)
✔️ Handling dynamic content (JavaScript-rendered pages)
✔️ Error handling for smooth execution
1️⃣ Setting Up Selenium for Excel VBA
📍 Step 1: Download SeleniumBasic
To use Selenium in VBA, we need SeleniumBasic, a VBA wrapper for Selenium WebDriver.
- Download SeleniumBasic from GitHub.
- Install it on your system (default location: C:\Program Files\SeleniumBasic).
📍 Step 2: Download ChromeDriver
Selenium requires ChromeDriver to automate Google Chrome.
- Visit Chrome for Testing.
- Download the ChromeDriver version that matches your installed Chrome version.
- Extract and place the chromedriver.exe file in the SeleniumBasic folder (C:\Program Files\SeleniumBasic).
📍 Step 3: Add Selenium Library in VBA
- Open Excel and press ALT + F11 to open the VBA Editor.
- Go to Tools > References.
- Find and check Selenium Type Library.
- Click OK to enable Selenium in VBA.
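Before moving on, it is worth confirming the setup works end to end. The minimal macro below simply launches Chrome, loads a page, and reports its title; if the reference or ChromeDriver is misconfigured, this is where the error will surface (the URL is only an example):

```vba
' Sanity check: run this after enabling the Selenium Type Library reference.
' If Chrome opens and the message box shows a page title, the setup is complete.
Sub TestSeleniumSetup()
    Dim driver As New Selenium.WebDriver
    driver.Start "chrome"
    driver.Get "https://www.google.com/"
    MsgBox "Selenium is working! Page title: " & driver.Title, vbInformation, "Setup Check"
    driver.Quit
End Sub
```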
2️⃣ Writing VBA Code for Web Scraping
Now, let’s write VBA code to scrape stock prices from a finance website like Moneycontrol or NSE India.
📍 Step 1: Open a Website and Extract Data
```vba
Sub ScrapeStockPrice()
    Dim driver As New Selenium.WebDriver
    Dim keysObj As New Selenium.Keys ' Needed to send special keys like Enter
    Dim stockPrice As String

    ' Set up Chrome WebDriver
    driver.Start "chrome"
    driver.Get "https://www.moneycontrol.com/"

    ' Find the search box and enter stock name
    driver.FindElementByName("search_str").SendKeys "TCS"
    driver.FindElementByName("search_str").SendKeys keysObj.Enter
    Application.Wait Now + TimeValue("00:00:03") ' Wait for page to load

    ' Extract stock price (modify selector as per website structure)
    stockPrice = driver.FindElementByXPath("//div[@class='inprice1']").Text

    ' Display result in Excel
    Sheets(1).Range("A1").Value = "TCS Stock Price"
    Sheets(1).Range("B1").Value = stockPrice

    ' Close browser
    driver.Quit
    MsgBox "Stock price fetched successfully!", vbInformation, "Scraping Complete"
End Sub
```
🔹 What This Code Does:
✅ Opens Chrome using Selenium
✅ Navigates to Moneycontrol
✅ Searches for a stock (e.g., TCS)
✅ Extracts the stock price and stores it in Excel
✅ Closes the browser after completion
3️⃣ Automating Website Navigation in VBA
Often, we need to click buttons, fill forms, and navigate multiple pages before scraping data.
📍 Example: Automate Login and Navigate
```vba
Sub AutomateLogin()
    Dim driver As New Selenium.WebDriver

    driver.Start "chrome"
    driver.Get "https://example.com/login"

    ' Enter username and password
    driver.FindElementById("username").SendKeys "your_username"
    driver.FindElementById("password").SendKeys "your_password"

    ' Click login button
    driver.FindElementByXPath("//button[text()='Login']").Click
    Application.Wait Now + TimeValue("00:00:05") ' Wait for login to complete

    MsgBox "Login successful!", vbInformation, "Automation Complete"
    driver.Quit
End Sub
```
🔹 What This Code Does:
✅ Navigates to the login page
✅ Fills in username and password
✅ Clicks the login button
✅ Waits for the page to load before proceeding
4️⃣ Handling JavaScript-Rendered Content
Some websites load content dynamically using JavaScript (AJAX), so an element may not exist yet when the page first opens. SeleniumBasic handles these delays through the optional timeout parameter on its Find methods, which keeps polling for the element instead of failing immediately.
📍 Example: Waiting for an Element to Load
```vba
Dim element As Selenium.WebElement
Dim stockPrice As String

' SeleniumBasic timeouts are in milliseconds: wait up to 10 seconds
Set element = driver.FindElementByXPath("//div[@class='stock-price']", timeout:=10000)
stockPrice = element.Text
```
🔹 This waits up to 10 seconds for the element to appear before extracting data.
5️⃣ Scraping Multiple Records into Excel
Often, we need to extract multiple data points (e.g., a list of stock prices).
📍 Example: Scraping a Table from a Website
```vba
Sub ScrapeStockTable()
    Dim driver As New Selenium.WebDriver
    Dim rows As Selenium.WebElements
    Dim row As Selenium.WebElement
    Dim i As Integer

    driver.Start "chrome"
    driver.Get "https://www.example.com/stocks"

    ' Find all table rows
    Set rows = driver.FindElementsByXPath("//table[@id='stockTable']/tbody/tr")

    ' Loop through each row and extract data
    i = 2 ' Start from row 2 in Excel
    For Each row In rows
        Sheets(1).Cells(i, 1).Value = row.FindElementByXPath("./td[1]").Text ' Stock name
        Sheets(1).Cells(i, 2).Value = row.FindElementByXPath("./td[2]").Text ' Price
        Sheets(1).Cells(i, 3).Value = row.FindElementByXPath("./td[3]").Text ' Change %
        i = i + 1
    Next row

    driver.Quit
    MsgBox "Stock data extracted!", vbInformation, "Scraping Done"
End Sub
```
🔹 What This Code Does:
✅ Extracts a stock table from a website
✅ Loops through each row to get stock names, prices, and changes
✅ Stores the data in Excel dynamically
6️⃣ Error Handling in Web Scraping VBA
Errors can occur if:
❌ The website structure changes
❌ The internet is slow, and elements don't load
❌ Selenium fails to locate an element
📍 Adding Error Handling
```vba
Dim priceElement As Selenium.WebElement ' An element object, not a String

On Error Resume Next
Set priceElement = driver.FindElementByXPath("//div[@class='price']")
If Err.Number <> 0 Then
    MsgBox "Stock price not found!", vbCritical, "Error"
    driver.Quit
    Exit Sub
End If
On Error GoTo 0
```
🔹 This prevents the script from crashing and handles missing elements gracefully.
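For longer macros, a labeled error handler keeps all cleanup in one place and guarantees the browser is closed even when scraping fails partway through. The sketch below follows that pattern; the URL and XPath are placeholders to adapt to your target site:

```vba
Sub ScrapeWithErrorHandler()
    Dim driver As New Selenium.WebDriver
    Dim priceElement As Selenium.WebElement

    On Error GoTo CleanUp ' Any runtime error below jumps to CleanUp
    driver.Start "chrome"
    driver.Get "https://www.example.com/stocks"

    ' If the element is missing or the page times out, execution jumps to CleanUp
    Set priceElement = driver.FindElementByXPath("//div[@class='price']")
    Sheets(1).Range("B1").Value = priceElement.Text
    MsgBox "Scraping complete!", vbInformation, "Done"

CleanUp:
    If Err.Number <> 0 Then
        MsgBox "Scraping failed: " & Err.Description, vbCritical, "Error"
    End If
    driver.Quit ' Always close the browser, success or failure
End Sub
```

Because execution falls through into `CleanUp` on success as well, `driver.Quit` runs on every path and no orphaned Chrome windows are left behind.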
Conclusion
By using Excel VBA and Selenium, you can:
✔ Scrape stock prices, product listings, or any web data
✔ Automate login and form submission
✔ Extract dynamic JavaScript-rendered content
✔ Store scraped data directly in Excel
✔ Improve efficiency with error handling
With this guide, you’re now ready to build automated data extraction tools in Excel! 🚀