puppeteer newly loaded page after the form submit
By : Niv Bonder
Date : March 29 2020, 07:55 AM
I got it working by waiting for the page to load first; after that I was able to click the correct element. So I added the following lines: code :
await page.waitForNavigation();
await page.click(".show-filters");
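A common refinement of this pattern, sketched under the assumption that `page` is an existing Puppeteer Page and the selectors are placeholders: start `waitForNavigation` before the click that triggers the navigation, so the two cannot race.

```javascript
// Sketch: wait for the navigation and perform the click that causes it
// together. Assumes `page` is an existing Puppeteer Page; the selector
// names are placeholders.
async function submitAndWait(page, submitSelector, nextSelector) {
  // Start waiting *before* clicking, otherwise the navigation can
  // complete before waitForNavigation is even listening.
  await Promise.all([
    page.waitForNavigation({ waitUntil: 'networkidle0' }),
    page.click(submitSelector), // triggers the form submit
  ]);
  // The new document has loaded; the element now exists and can be clicked.
  await page.click(nextSelector);
}
```

Wrapping both calls in `Promise.all` is the idiom the Puppeteer docs recommend for click-then-navigate flows.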
Looping inside a page.evaluate in Puppeteer
By : wangwenbin
Date : March 29 2020, 07:55 AM
The problem: a loop inside a page.evaluate call iterates over a query selector and reads innerText from multiple matching elements on the page. The code below fails because i is only defined in Node, not inside the evaluate callback where the template literal is built. Try: code :
const serpDesc = await page.evaluate(
() => [...document.querySelectorAll(`#rso > div:nth-child(4) > div > div:nth-child(${i}) > div > div > div.s > div > span`)].map(elem => elem.innerText)
);
async function elSelector(i) {
  // i is the index passed in from Node; page.evaluate serializes it
  // into the browser context as its second argument.
  return page.evaluate((i) => {
    const matches = $('yourSelector').toArray(); // assumes jQuery is loaded on the page
    return matches[i].innerText;
  }, i);
}

for (let i = 0; i < 9; i++) {
  await elSelector(i);
}
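A simpler alternative, sketched under the assumption that `page` is an existing Puppeteer Page and the selector is a placeholder: `page.$$eval` runs one function in the browser over every matching element, so the per-index loop (and the jQuery dependency) is not needed at all.

```javascript
// Sketch: collect the innerText of every element matching `selector`
// in a single round trip to the browser. `page` is assumed to be an
// existing Puppeteer Page; `selector` is a placeholder.
async function getAllTexts(page, selector) {
  return page.$$eval(selector, elems => elems.map(el => el.innerText));
}
```

One evaluate call instead of nine also avoids serializing `i` back and forth for each element.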
Get title from newly opened page puppeteer
By : niakb01
Date : March 29 2020, 07:55 AM
According to the Puppeteer documentation, page.title() returns <Promise<string>>, the page's title. code :
// Note: _frameManager and _mainFrame are private internals; the
// documented equivalent is simply await page.title()
page._frameManager._mainFrame.evaluate(() => document.title)
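A sketch using only documented APIs, assuming `browser` and `page` are an existing Puppeteer Browser and Page and `linkSelector` is a placeholder for a link that opens a new tab: wait for the new target, resolve it to a Page, and read its title with `page.title()`.

```javascript
// Sketch: get the title of a tab opened by a click. `browser` and
// `page` are assumed to be existing Puppeteer objects; the selector
// is a placeholder.
async function titleOfNewTab(browser, page, linkSelector) {
  const [target] = await Promise.all([
    // 'targetcreated' fires when the new tab appears.
    new Promise(resolve => browser.once('targetcreated', resolve)),
    page.click(linkSelector), // opens the new tab
  ]);
  const newPage = await target.page();
  return newPage.title();
}
```

This avoids reaching into `_frameManager`, which can break between Puppeteer versions.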
puppeteer how to return page.on response values
By : Nejc Kotnik
Date : March 29 2020, 07:55 AM
You shouldn't use the await operator before page.on(). The Puppeteer Page class extends Node.js's native EventEmitter, which means that whenever you call page.on(), you are registering an event listener with Node.js's emitter.on(). code :
const example_function = value => {
  console.log(value);
};

page.on('response', resp => {
  const header = resp.headers(); // an object keyed by lower-cased header names
  example_function(header['content-disposition']);
});
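When the surrounding code needs the value back, rather than just handing it to a callback, a promise-based sketch works. This assumes `page` is an existing Puppeteer Page and `urlPart` is a placeholder substring identifying the response you care about; `page.waitForResponse` resolves with the first response whose URL satisfies the predicate.

```javascript
// Sketch: return a response header to the caller instead of logging it.
// `page` is assumed to be an existing Puppeteer Page; `urlPart` is a
// placeholder.
async function getContentDisposition(page, urlPart) {
  const response = await page.waitForResponse(resp => resp.url().includes(urlPart));
  return response.headers()['content-disposition'];
}
```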
Puppeteer get information about page loaded - list of files loaded and their sizes
By : HGDevelop
Date : March 29 2020, 07:55 AM
Page assets are not stored on disk; they are held in browser memory and sometimes cached, so you cannot read their sizes off the filesystem. What you want to look at is web scraping, which can be done with modules like node-website-scraper or with Puppeteer: code :
const path = require('path');
const fs_extra = require('fs-extra'); // third-party: npm install fs-extra

page.on('response', async (response) => {
  const url = new URL(response.url());
  let filePath = path.resolve(`./output${url.pathname}`);
  // Paths without a file extension are treated as pages and saved as index.html.
  if (path.extname(url.pathname).trim() === '') {
    filePath = `${filePath}/index.html`;
  }
  await fs_extra.outputFile(filePath, await response.buffer());
});
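If the original goal was the list of loaded files and their sizes rather than the files themselves, each response body is available in memory via `response.buffer()`, so the byte length can be recorded without writing anything to disk. A sketch, assuming `page` is an existing Puppeteer Page and `sizes` is an array supplied by the caller:

```javascript
// Sketch: record {url, bytes} for every network response on the page.
// `page` is assumed to be an existing Puppeteer Page; `sizes` is a
// caller-supplied array.
function recordResponseSizes(page, sizes) {
  page.on('response', async response => {
    try {
      const body = await response.buffer();
      sizes.push({ url: response.url(), bytes: body.length });
    } catch (err) {
      // Redirect responses and some cached entries have no body to buffer.
    }
  });
}
```

After `page.goto(...)` completes, `sizes` holds one entry per fetched asset.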