LHCI Lighthouse disconnects Google authentication done with Puppeteer

I'm trying to run LHCI on a website that is only accessible after Google authentication.
I use Puppeteer to authenticate and it works well, but after the Puppeteer steps LHCI opens a new browser and I can see that I am disconnected (I am redirected to the Google login page).
The steps I observe are the following:
- Puppeteer opens a browser
- Puppeteer opens a new page
- Puppeteer logs in with the Google account
- I am connected on my website
- LHCI opens a new browser
- I am redirected to the Google login page in the new browser
- LHCI tests the performance of the Google login page instead of my website...
lighthouserc.js:
module.exports = {
  ci: {
    collect: {
      headful: true,
      disableStorageReset: true,
      puppeteerScript: './puppeteerScript.js',
      puppeteerLaunchOptions: {
        slowMo: 20,
        headless: false,
        disableStorageReset: true,
      },
      settings: {
        disableStorageReset: true,
        preset: 'desktop',
        'throttling-method': 'provided',
        onlyCategories: ['performance', 'accessibility', 'seo'],
      },
      numberOfRuns: 1,
      url: UrlsTab,
    },
    assert: {
      assertions: {
        'categories:performance': ['error', { minScore: 0.9 }],
        'categories:accessibility': ['error', { minScore: 0.9 }],
        'categories:seo': ['error', { minScore: 0.9 }],
      },
    },
    upload: {
      target: 'temporary-public-storage',
    },
  },
};
puppeteerScript.js:
/**
 * @param {puppeteer.Browser} browser
 * @param {{url: string, options: LHCI.CollectCommand.Options}} context
 */
const puppeteer = require('puppeteer');

async function doGoogleLogin(loginUrl, page, email, password) {
  const navigationPromise = page.waitForNavigation();
  await page.goto(loginUrl);
  await navigationPromise;
  await page.waitForSelector('input[type="email"]');
  await page.click('input[type="email"]');
  await navigationPromise;
  await page.type('input[type="email"]', email);
  await page.waitForSelector('#identifierNext');
  await page.click('#identifierNext');
  await page.waitFor(500);
  await page.waitForSelector('input[type="password"]');
  await page.waitFor(500);
  await page.type('input[type="password"]', password);
  await page.waitForSelector('#passwordNext');
  await page.click('#passwordNext');
  await navigationPromise;
  await page.waitFor(1000);
}

async function setup(browser, context) {
  browser = await puppeteer.launch({ headless: false, disableStorageReset: true });
  const page = await browser.newPage();
  await page.setCacheEnabled(true);
  await doGoogleLogin(context.url, page, googleEmail, googlePassword);
}

module.exports = setup;
I tried disableStorageReset: true but it is not sufficient to preserve the session. Do you have any idea of something I could try?
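One detail worth checking (a guess on my part, not a confirmed fix): the setup() above launches a second browser with puppeteer.launch(), so the Google session ends up in a browser that Lighthouse never connects to. LHCI passes the browser it launched, and will reuse for the runs, as the first argument of the script. A minimal sketch of puppeteerScript.js that logs in inside that same browser, reusing the doGoogleLogin helper above (googleEmail and googlePassword are placeholders you would supply):

/**
 * @param {import('puppeteer').Browser} browser - the browser LHCI launched and will reuse
 * @param {{url: string, options: LHCI.CollectCommand.Options}} context
 */
module.exports = async (browser, context) => {
  // Open a page in the LHCI-launched browser instead of a second one,
  // so cookies and session storage end up where Lighthouse will run.
  const page = await browser.newPage();
  await doGoogleLogin(context.url, page, googleEmail, googlePassword);
  await page.close(); // keep the browser itself open for the Lighthouse runs
};

Combined with disableStorageReset under settings, the session then has a chance to survive into the Lighthouse pass.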

Related

Cookie received from the server isn't stored in the browser

I'm trying to create an app using React and Node.js.
I'm using express-session and express-mysql-session to store sessions.
The session is stored in MySQL, but the cookie is not stored in the browser, even though the response cookies include the session created by the server. Everything works just fine in development, but after deploying the application on Render I ran into this problem.
server.js:
const express = require("express");
const app = express();
const cors = require("cors");
const config = require("./config/config");
const mode = process.env.NODE_ENV;
app.use(express.json());

const session = require("express-session");
const sqlSessionStor = require("express-mysql-session")(session);
const dbInfo = config[mode];
const options = {
  ...dbInfo,
  schema: {
    tableName: "sessions",
    columnNames: {
      session_id: "session_id",
      expires: "expires",
      data: "data",
    },
  },
};
const sessionStor = new sqlSessionStor(options);

app.use(
  session({
    name: "auth",
    key: "auth",
    resave: false,
    saveUninitialized: false,
    secret: "strongSecretKey",
    store: sessionStor,
    cookie: {
      maxAge: 1000 * 60 * 60 * 24,
    },
  }),
);

const clientUrl = process.env.CLIENT_URL;
app.use(
  cors({
    origin: clientUrl,
    credentials: true,
  }),
);
login.js:
exports.login = (req, res) => {
  authModel
    .login(req.body)
    .then((result) => {
      req.session.isUser = result.user;
      res.send(result);
    })
    .catch((err) => {
      res.send(err);
    });
};
On the client (React):
async function login() {
  const options = {
    headers: { "Content-Type": "application/json" },
    withCredentials: true,
  };
  const req = await axios.post(`${VITE_API_KEY}/login`, userInfo, options);
  const user = req.data;
  if (user.login) {
    //
  } else {
    //
  }
}
Response cookies: (screenshot)
Browser cookies: (screenshot)
Here are some solutions that did not help me:
- I set httpOnly: true and secure: true, but then the server does not send any response cookies at all, so I left those options out of the code above.
- I tried sameSite with all of its values.
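For reference, a minimal sketch of the session settings that are commonly needed when the API runs behind a hosting proxy (such as Render) and the React app is served from a different origin. It assumes HTTPS in front of the app and reuses the sessionStor from server.js above; this is a guess about the deployment, not a verified fix:

// Cross-site cookies must be Secure and SameSite=None, and express-session
// needs trust proxy to see the original HTTPS scheme behind the proxy.
app.set("trust proxy", 1);
app.use(
  session({
    name: "auth",
    secret: "strongSecretKey",
    resave: false,
    saveUninitialized: false,
    store: sessionStor,
    cookie: {
      maxAge: 1000 * 60 * 60 * 24,
      secure: true,     // required for SameSite=None
      sameSite: "none", // allow the browser to send the cookie cross-origin
    },
  }),
);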

Puppeteer Set Request Cookie Header

I'm trying to set the cookies on each request via Puppeteer request interception. I've noticed that while setting headers['sample-header'] = 1 creates a header 'sample-header' equal to 1, setting headers['cookie'] = x... does not set the request's cookie. For instance, the following code does not set any request cookies.
const browser = await puppeteer.launch({ headless: false, executablePath: dirnae });
const page = await browser.newPage();
const client = await page.target().createCDPSession();
await page.setRequestInterception(true);
page.on('request', request => {
  const headers = request.headers();
  headers['cookie'] = 1;
  request.continue({ headers });
});
page.on('request', request => {
  console.log(request.headers());
});
page.on('response', response => {
  //console.log(response.headers()['set-cookie'])
});
await page.goto('https://google.com');
EDIT: I figured out that I can see the request's cookie header by handling the Network.requestWillBeSentExtraInfo event.
However, I can't seem to edit requests in that event.
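For context, a minimal sketch of observing that event through the CDP session the snippet above already creates; as far as I know the event is observe-only, so it shows the headers the browser actually sends (including the Cookie header) but does not allow modifying the request:

const client = await page.target().createCDPSession();
await client.send('Network.enable');
client.on('Network.requestWillBeSentExtraInfo', (event) => {
  // event.headers are the raw headers as they go over the wire.
  console.log(event.headers['cookie'] || event.headers['Cookie']);
});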
You cannot change the cookie per network request. You can use page.setCookie and provide cookies for different URLs or domains. Below is the code for reference:
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const cookies = [
    {
      name: "sample-cookie1",
      value: "1",
      domain: "stackoverflow.com"
    },
    {
      name: "sample-cookie2",
      value: "2",
      domain: "pptr.dev"
    }
  ];
  await page.setCookie(...cookies);
  await page.goto("https://pptr.dev");
  console.log(await page.cookies()); // this will have the sample-cookie2 cookie
  await page.goto("https://stackoverflow.com");
  console.log(await page.cookies()); // this will have the sample-cookie1 cookie
})();

How to remove SSL certificate check/error with puppeteer in headless mode?

I tried the code below but I'm getting an error:
Error: net::ERR_SSL_VERSION_OR_CIPHER_MISMATCH at
https://www.xxxxxxsolutions.com/
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({
    ignoreHTTPSErrors: true,
    acceptInsecureCerts: true,
    args: [
      '--proxy-bypass-list=*',
      '--disable-gpu',
      '--disable-dev-shm-usage',
      '--disable-setuid-sandbox',
      '--no-first-run',
      '--no-sandbox',
      '--no-zygote',
      '--single-process',
      '--ignore-certificate-errors',
      '--ignore-certificate-errors-spki-list',
      '--enable-features=NetworkService'
    ]
  });
  const page = await browser.newPage();
  try {
    await page.goto('https://www.xxxxxxxsolutions.com/', { waitUntil: 'networkidle2', timeout: 59000 });
    const cookies = await page._client.send('Network.getAllCookies');
    JSON.stringify(cookies, null, 4);
  } catch (e) {
    console.log(e);
  }
  await browser.close();
})();
@mujuonly, this is a version-related issue. Please try the same code on 1.16.0 or above, or the latest version 2.0. It's working fine there.
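As an illustration of what that answer suggests (assuming puppeteer 1.16.0 or later), ignoreHTTPSErrors at launch is normally enough to skip certificate validation without the long args list; whether it resolves this specific ERR_SSL_VERSION_OR_CIPHER_MISMATCH also depends on the server's TLS configuration:

const puppeteer = require('puppeteer'); // assumed >= 1.16.0 / 2.x

(async () => {
  const browser = await puppeteer.launch({ ignoreHTTPSErrors: true });
  const page = await browser.newPage();
  await page.goto('https://www.xxxxxxsolutions.com/', { waitUntil: 'networkidle2' });
  await browser.close();
})();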

Puppeteer-extra allow flash support

I need puppeteer (not in headless mode) to open a page and have Flash enabled from the get-go,
meaning no manual downloading or clicking to run Flash.
So far I've added puppeteer-extra and its Flash plugin as was used in a prior question:
Allowing to run Flash on all sites in Puppeteer
My Chrome version is 75.0.3770.142 and my puppeteer dependencies are:
* "puppeteer": "^1.19.0",
* "puppeteer-core": "^1.19.0",
* "puppeteer-extra": "^2.1.3",
* "puppeteer-extra-plugin-flash": "^2.1.3",
* "puppeteer-extra-plugin-user-data-dir": "^2.1.2",
* "puppeteer-extra-plugin-user-preferences": "^2.1.2",
import puppeteer from 'puppeteer';
import PuppeteerCore from 'puppeteer-core';
import PuppeteerExtra from 'puppeteer-extra';
import PuppeteerFlash from 'puppeteer-extra-plugin-flash';

PuppeteerExtra.use(PuppeteerFlash());

(async () => {
  const browser = await PuppeteerExtra.launch({
    headless: false,
    executablePath: '/Applications/Google Chrome.app/Contents/MacOS/Google Chrome',
    args: [
      '--window-size=800,600',
      '--enable-webgl',
      '--enable-accelerated-2d-canvas',
    ],
  });
  const page = await browser.newPage();
  await page.setViewport({ width: 800, height: 600 });
  await page.goto('http://ultrasounds.com', { waitUntil: 'networkidle2' });
})();
I expected the above code to open the page, download the necessary Flash and run the Flash content when done.
As it is, though, it does the download but still requires the user to click to enable Flash before the content runs.
I'm wondering if anyone could let me know whether I'm doing anything wrong in the above code, whether I've misunderstood something, or otherwise?
If you use a locally installed Chrome via executablePath, you don't need puppeteer-extra-plugin-flash.
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({
    executablePath: '/Applications/Google Chrome.app/Contents/MacOS/Google Chrome',
    ignoreHTTPSErrors: true,
    headless: false,
  });
  const page = await browser.newPage();
  await page.goto('https://v.qq.com/iframe/preview.html?width=500&height=375&auto=0&vid=a30198lw6j2');
  const dimensions = await page.evaluate(() => {
    return {
      src: document.getElementById('tenvideo_video_player_0').getAttribute('src'),
    };
  });
  console.log('Dimensions:', dimensions);
  await browser.close();
})();

How to use Proxy with puppeteer

I'm new to puppeteer and Node, trying to use a proxy with puppeteer in order to collect requests and responses, and hopefully also WebSocket communication, but so far I couldn't get anything to work.
I'm trying the following code:
const puppeteer = require('puppeteer');
const httpProxy = require('http-proxy');
const url = require('url');

let runProxy = async () => {
  // raise a proxy and start collecting req.url/response.statusCode
};

let run = async () => {
  await runProxy();
  const browser = await puppeteer.launch({
    headless: false,
    args: [
      '--start-fullscreen',
      '--proxy-server=localhost:8096'
    ]
  });
  page = await browser.newPage();
  await page.setViewport({ width: 1920, height: 1080 });
  await page.goto('http://www.google.com',
    { waitUntil: 'networkidle2', timeout: 120000 });
};

run();
I've tried some variations from https://github.com/nodejitsu/node-http-proxy but nothing seems to work for me. Any guidance would be appreciated, thanks.
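For the empty runProxy above, here is a minimal sketch of what a logging proxy built on http-proxy could look like. It assumes plain HTTP traffic only; HTTPS (and WebSocket over TLS) goes through CONNECT tunnels, which this sketch does not handle:

const http = require('http'); // in addition to the requires above

let runProxy = async () => {
  const proxy = httpProxy.createProxyServer({});
  // Log the status code of every proxied response next to its request URL.
  proxy.on('proxyRes', (proxyRes, req) => {
    console.log(req.url, proxyRes.statusCode);
  });
  http.createServer((req, res) => {
    // A browser sends absolute URLs to a forward proxy, so the target
    // host can be read straight from req.url.
    const { protocol, host } = url.parse(req.url);
    proxy.web(req, res, { target: `${protocol}//${host}` });
  }).listen(8096); // matches --proxy-server=localhost:8096 above
};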
Try this: use https-proxy-agent or http-proxy-agent to proxy requests per page:
import {Job, Launcher, OnStart, PuppeteerUtil, PuppeteerWorkerFactory} from "../..";
import {Page} from "puppeteer";

class TestTask {
  @OnStart({
    urls: [
      "https://www.google.com",
      "https://www.baidu.com",
      "https://www.bilibili.com",
    ],
    workerFactory: PuppeteerWorkerFactory
  })
  async onStart(page: Page, job: Job) {
    await PuppeteerUtil.defaultViewPort(page);
    await PuppeteerUtil.useProxy(page, "http://127.0.0.1:2007");
    await page.goto(job.url);
    console.log(await page.evaluate(() => document.title));
  }
}

@Launcher({
  workplace: __dirname + "/workplace",
  tasks: [
    TestTask
  ],
  workerFactorys: [
    new PuppeteerWorkerFactory({
      headless: false,
      devtools: true
    })
  ]
})
class App {}