Python parallel execution with Selenium
I'm confused about parallel execution in Python using Selenium. There seem to be a few ways to go about it, but some seem out of date.

  1. There's a Python module called python-wd-parallel which seems to have some functionality for this, but it's from 2013; is it still useful now? I also found this example.

  2. There's concurrent.futures; this seems a lot newer but not as easy to implement. Does anyone have a working example of parallel execution with Selenium?

  3. There's also the option of just using threads and executors to get the job done, but I feel this will be slower, because it's not using all the cores and still runs serially.

What is the current way to do parallel execution using Selenium?

Shawm answered 11/3, 2017 at 8:12 Comment(2)
About item 1: there are many companies out there that offer solutions for parallel testing. Sauce Labs is one of those, but there are many more listed on the Selenium Grid page. Selenium Grid is also an alternative for parallelism outside pure Python.Dubois
Just to complete: those companies are listed as Selenium Level Sponsors.Dubois
Use joblib's Parallel module to do that; it's a great library for parallel execution.

Let's say we have a list of URLs named urls, and we want to take a screenshot of each one in parallel.

First, let's import the necessary libraries:

from selenium import webdriver
from joblib import Parallel, delayed

Now let's define a function that returns a screenshot as base64:

def take_screenshot(url):
    # PhantomJS is deprecated in recent Selenium releases; a headless
    # Chrome/Firefox driver can be substituted here with the same calls
    phantom = webdriver.PhantomJS('/path/to/phantomjs')
    phantom.get(url)
    screenshot = phantom.get_screenshot_as_base64()
    phantom.quit()  # quit() shuts down the driver process; close() only closes the window

    return screenshot

Now, to execute that in parallel, you would do:

screenshots = Parallel(n_jobs=-1)(delayed(take_screenshot)(url) for url in urls)

When this line finishes executing, screenshots will hold the results from all of the processes that ran.

Explanation about Parallel

  • Parallel(n_jobs=-1) means use all the resources you can (all available cores)
  • delayed(function)(input) is joblib's way of creating the input for the function you are trying to run in parallel

More information can be found on the joblib docs
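Building on this, a minimal sketch of the batching idea (take_screenshots_batch is a hypothetical helper, not part of joblib): to avoid paying driver start-up cost for every URL, split urls into one chunk per worker so each process creates a single driver, uses it for its whole chunk, and quits it once.

```python
def chunk(seq, n_chunks):
    """Split seq into n_chunks contiguous sublists of near-equal size."""
    n_chunks = min(n_chunks, len(seq)) or 1
    size, extra = divmod(len(seq), n_chunks)
    chunks, start = [], 0
    for i in range(n_chunks):
        end = start + size + (1 if i < extra else 0)  # first `extra` chunks get one more item
        chunks.append(seq[start:end])
        start = end
    return chunks

def take_screenshots_batch(url_batch):
    # Hypothetical helper: create ONE webdriver instance here, loop over
    # url_batch collecting get_screenshot_as_base64() results, then quit()
    # the driver before returning the list of screenshots.
    ...

# With joblib this would then be (4 workers, one driver per worker):
# batches = Parallel(n_jobs=4)(delayed(take_screenshots_batch)(c)
#                              for c in chunk(urls, 4))
```

Each worker then amortizes one driver start-up over its whole sublist instead of paying it per URL.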

Filemon answered 10/5, 2017 at 13:19 Comment(7)
Is there any straight-forward way to re-use one webdriver.PhantomJS instance for one n_jobs, rather than closing and opening for every iteration?Tarragon
Why would you want to do that? It seems like a bad idea to try to access one webdriver instance from multiple processes - I believe that will hurt parallelization. Anyway, if you decide to go on with that, you will have to make the webdriver serializableFilemon
many thanks. my rational was to have one driver instance for each process (not one driver instance for more than one process) since in the list "how to speed up selenium" the line item "reuse driver instances" is pretty much on topTarragon
You are right, that is what happens in the code. take_screenshot is basically what runs in every process, and you can do whatever you want in that function. So you basically create a webdriver instance for each process, do what you need in that process, and close it before the process is doneFilemon
To avoid recreating instances, I would chop the urls list into even-sized sublists and then send them to the processes; that way the spawning of the processes (and the creation of the webdriver instance) happens only once per processFilemon
thanks for your advice - that's exactly what I wanted to know - do you think this approach has any potential to significantly speed up scraping large URL pools (educated guess)? I am new to selenium and am glad for every input at this stageTarragon
As I see it, it depends. It is all a function of the overhead of spawning processes and transferring data between them versus the benefits of parallelization - but mostly, if the operations are not really short, you will benefit from using a parallel implementationFilemon
  1. Python Parallel Wd seems to be dead, judging from its GitHub (last commit 9 years ago). It also implements an obsolete protocol for Selenium. Finally, the code is proprietary to Sauce Labs.

Generally it's better to use SeleniumBase, a Python test framework based on Selenium and pytest. It's very complete and supports everything for a performance boost, parallel threads, and much more. If that's not your case... keep reading.

Selenium Performance Boost (concurrent.futures)

Short Answer

  • Both threads and processes will give you a considerable speed-up of your Selenium code.

Short examples are given below. The Selenium work is done by the selenium_title function, which returns the page title. The examples don't deal with exceptions raised during each thread/process execution; for that, see Long Answer - Dealing with exceptions.

  1. Pool of thread workers concurrent.futures.ThreadPoolExecutor.
from selenium import webdriver  
from concurrent import futures

def selenium_title(url):  
  wdriver = webdriver.Chrome() # chrome webdriver
  wdriver.get(url)  
  title = wdriver.title  
  wdriver.quit()
  return title

links = ["https://www.amazon.com", "https://www.google.com"]

with futures.ThreadPoolExecutor() as executor: # default/optimized number of threads
  titles = list(executor.map(selenium_title, links))
  2. Pool of process workers concurrent.futures.ProcessPoolExecutor. Just replace ThreadPoolExecutor with ProcessPoolExecutor in the code above. Both derive from the Executor base class. You must also protect the main entry point, like below.
if __name__ == '__main__':
  with futures.ProcessPoolExecutor() as executor: # default/optimized number of processes
    titles = list(executor.map(selenium_title, links))

Long Answer

Why do threads work despite the Python GIL?

Even though Python limits threads due to the GIL, and even though threads are context-switched, the performance gain comes from an implementation detail of Selenium: Selenium works by sending commands such as POST and GET (HTTP requests) to the browser driver server. As you might already know, I/O-bound tasks (such as HTTP requests) release the GIL, hence the performance gain.
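A minimal, self-contained demonstration of this effect, using time.sleep as a stand-in for the blocking HTTP round-trip Selenium performs (sleep releases the GIL just like socket I/O does):

```python
import time
from concurrent import futures

def fake_selenium_call(url):
    time.sleep(0.2)  # stand-in for a blocking webdriver HTTP request
    return url

links = ["https://a.example", "https://b.example",
         "https://c.example", "https://d.example"]

start = time.perf_counter()
with futures.ThreadPoolExecutor() as executor:
    results = list(executor.map(fake_selenium_call, links))
elapsed = time.perf_counter() - start
# Serial execution would take ~0.8 s; the threads overlap their waits,
# so this finishes in roughly 0.2 s even with the GIL in place.
```

The same pattern is why the real selenium_title function above speeds up under a thread pool: each thread spends almost all of its time waiting on the driver server, not holding the GIL.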

Dealing with exceptions

We can make small modifications to the example above to deal with exceptions in the spawned threads. Instead of using executor.map, we use executor.submit. That returns the titles wrapped in Future instances.

To access the returned title we can call future_titles[index].result() (with index ranging over len(links)), or simply use a for loop like the one below.

with futures.ThreadPoolExecutor() as executor:
  future_titles = [ executor.submit(selenium_title, link) for link in links ]
  for future_title, link in zip(future_titles, links):
    try:
      title = future_title.result() # can use `timeout` to wait max seconds for each thread
    except Exception as exc: # this thread might have had an exception
      print('url {0} generated an exception: {1}'.format(link, exc))

Note that besides iterating over future_titles we also iterate over links, so that in case of an exception in some thread we know which URL (link) was responsible for it.

The futures.Future class is handy because it gives you control over the result received from each thread: whether it completed correctly, whether there was an exception, and so on; more about that here.

It is also important to mention that futures.as_completed is better if you don't care about the order in which the threads return items. But since the syntax for controlling exceptions with it is a little ugly, I omitted it here.
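For completeness, a sketch of the as_completed variant, with a pure-Python stand-in (fetch_title, a hypothetical replacement for selenium_title above) so the error handling is easy to see. Results arrive in completion order, not submission order, which is why a dict maps each Future back to its URL:

```python
from concurrent import futures

def fetch_title(url):
    # stand-in for the selenium_title function above
    if "bad" in url:
        raise ValueError("could not load page")
    return "title of " + url

links = ["https://ok.example", "https://bad.example"]

titles, errors = [], []
with futures.ThreadPoolExecutor() as executor:
    # map each Future back to its url, since completion order is arbitrary
    future_to_link = {executor.submit(fetch_title, link): link for link in links}
    for future in futures.as_completed(future_to_link):
        link = future_to_link[future]
        try:
            titles.append(future.result())
        except Exception as exc:
            errors.append((link, exc))
```

Swapping selenium_title back in for fetch_title gives the same control flow with real browsers behind it.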

Performance gain and Threads

First, why I've always used threads to speed up my Selenium code:

  • On I/O-bound tasks, my experience with Selenium shows minimal or no difference between using a pool of processes and a pool of threads. Similar conclusions about Python threads vs. processes on I/O-bound tasks are reached here.
  • We also know that each process uses its own memory space, which means more memory consumption. Processes are also a little slower to spawn than threads.
Dubois answered 19/9, 2021 at 23:30 Comment(0)
