How to Get All Available Links on the Page using Selenium in Java?

Selenium is an open-source web automation tool used to automate browser testing. Its major advantage is that it supports all major web browsers, runs on all major operating systems, and lets you write scripts in various languages such as Java, JavaScript, C#, and Python. While automating a webpage, we often need to fetch and check all the links present on it. In this article, we will learn how to get all the available links on a page using the tag name locator.

As we know, every link in HTML is an anchor ("a") tag. For example,

<a href="w3wiki.net">w3wiki</a>
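Each part of that anchor maps directly to a Selenium call: By.tagName("a") locates the element, getText() returns the visible text ("w3wiki"), and getAttribute("href") returns the link target. The snippet below is a minimal sketch of that mapping; it assumes a WebDriver instance named driver is already open on a page containing such a link.

Java
// Locate the first anchor element on the page by its tag name
WebElement firstLink = driver.findElement(By.tagName("a"));

// The visible text between the opening and closing tags, e.g. "w3wiki"
System.out.println(firstLink.getText());

// The value of the href attribute, e.g. "w3wiki.net"
System.out.println(firstLink.getAttribute("href"));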

Table of Contents

  • How to fetch all the links on a webpage?
  • Sample Code Example to Scrape Links

How to fetch all the links on a webpage?

  • Navigate to the webpage.
  • Get the list of WebElements with the TagName “a”.
  • List<WebElement> links = driver.findElements(By.tagName("a"));
  • Iterate through the List of WebElements.
  • Print the link text.

Sample Code Example to Scrape Links

In this example, we navigate to the URL "https://www.w3wiki.net/" and print the link text of every available link on the page.

Java
import java.util.List;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;

import io.github.bonigarcia.wdm.WebDriverManager;

public class Beginner {

    public static void main(String[] args)
    {
        // Set up the matching ChromeDriver binary and start the browser
        WebDriverManager.chromedriver().setup();
        WebDriver driver = new ChromeDriver();
        driver.manage().window().maximize();

        // Navigate to the webpage
        driver.get("https://www.w3wiki.net/");

        // Get all the available links (anchor tags)
        List<WebElement> links
            = driver.findElements(By.tagName("a"));

        // Iterate through all the links and print the link text
        for (WebElement link : links) {
            System.out.println(link.getText());
        }

        // Close the browser window
        driver.close();
    }
}

Output:

The program collects every link on the page into a List of WebElements and prints the link text of each one.
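If you also need the URL each link points to, a common extension is to read the href attribute along with the link text. The loop below is a small sketch of that variation, assuming the same driver and links variables as in the example above; it skips anchors whose visible text is empty.

Java
// Print the visible text and target URL of every non-empty link
for (WebElement link : links) {
    String text = link.getText();
    if (!text.isEmpty()) {
        System.out.println(text + " -> " + link.getAttribute("href"));
    }
}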

