I have Node.js and Puppeteer set up and do a bit of scraping for a project. I would like to improve it. Currently the JS script is launched by PHP, which passes values to be used in the scrape, such as an ID. However, the JS script I have now has to log into the site every time I hit a new URL. Very inefficient.
I've briefly read about setting up a local proxy/server and using WebSocket endpoints (browserWSEndpoint) to keep Chromium running continuously instead of launching and closing it on every run, by splitting the work into a few scripts: one to launch the browser and others to do the actual scraping. I assume this approach would help my situation, so that Chromium stays open and logged in to the site, and I can just pass it URLs/values to scrape. Example of the goal... [login to view URL]
That is what I need to achieve. I will be here and available for about 4 hours, and I'm hoping to get this done in that timeframe.
I need someone who is very fluent with Node.js, Puppeteer, and the like to assist. Light coding and guidance needed.
Please only bid if you can start right away and help get this done quickly.