Selenium with Java to scrape data from a website



  • I am trying to scrape data from a site; I wrote the code below for it. I want to scrape the data on a button click event, but when I run my program it throws an exception.

    The exception is: java.lang.NoClassDefFoundError: com/google/common/base/Function

    How can I fix this exception and make my program work?

    Here is the code I tried:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.firefox.FirefoxDriver;
    import org.openqa.selenium.support.ui.Select;
    
    public class GetData {
    
        public static void main(String[] args) throws InterruptedException {
            WebDriver driver = new FirefoxDriver();
            driver.get("http://www.upmandiparishad.in/commodityWiseAll.aspx");
            Thread.sleep(5000);
            // select the commodity "Jo" (barley) from the dropdown
            new Select(driver.findElement(By.id("ctl00_ContentPlaceHolder1_ddl_commodity"))).selectByVisibleText("Jo");
            // click button
            Thread.sleep(3000);
            driver.findElement(By.id("ctl00_ContentPlaceHolder1_btn_show")).click();
            Thread.sleep(5000);
    
            // get only the table text
            WebElement findElement = driver.findElement(By.className("grid-view"));
            String htmlTableText = findElement.getText();
            // do whatever you want with this; it is the raw table text
            System.out.println(htmlTableText);
    
            // quit() ends the WebDriver session and closes all browser windows
            driver.quit();
    
        }
    }
    


  • This error means a required dependency is missing from your project's classpath: com.google.common.base.Function comes from Google Guava, which Selenium depends on. It has been suggested on Stack Overflow that the dependency you are missing may well be selenium-server-standalone-version.jar (with the version matching your Selenium release), which bundles Guava along with Selenium's other dependencies. Does adding that jar to your project help?
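
    If you are building with Maven, an alternative to the standalone jar is to declare the selenium-java artifact, which pulls in Guava transitively. A minimal sketch, assuming a Maven project; the version number is illustrative and should match the Selenium release you are actually using:

        <!-- sketch of a pom.xml dependency; the version shown is illustrative -->
        <dependency>
            <groupId>org.seleniumhq.selenium</groupId>
            <artifactId>selenium-java</artifactId>
            <version>2.53.1</version>
        </dependency>

    If you are not using a build tool, adding the standalone jar (or the Guava jar itself) directly to the project's build path achieves the same effect.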


