Manipulating Cookies

In each scraper (normal, renderless or apiScraper) step, cookies are automatically shared between all page loads (renderless and normal) and HTTP requests.
For example:

await ayakashi.goTo("https://somepage.com");
//...later
const apiResponse = await ayakashi.get("https://somepage.com/api");

or in a renderless scraper:

await ayakashi.load("https://somepage.com");
//...later
const apiResponse = await ayakashi.get("https://somepage.com/api");

The API call above will include all cookies we got from the initial page load.
This can be pretty useful for pages that protect their internal APIs in some way, e.g. with a session ID that is only issued when you load the full page, or an authentication token you receive after a login.

Ayakashi also includes a set of APIs to manage cookies manually.

Setting cookies

await ayakashi.setCookie({
    key: "myCookie",
    value: "test",
    domain: "somepage.com",
    path: "/",
    expires: "2029-10-25T17:29:23.375Z", //optional
    secure: true, //optional
    httpOnly: true, //optional
    hostOnly: false //optional
});

A setCookies() method is also available to set multiple cookies at once and accepts an array of cookie options.
The automatic sharing described above also applies to cookies set manually.
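For example, a minimal sketch of setting two cookies in one call (the cookie names and values are just illustrative):

await ayakashi.setCookies([{
    key: "myCookie",
    value: "test",
    domain: "somepage.com",
    path: "/"
}, {
    key: "myOtherCookie",
    value: "test2",
    domain: "somepage.com",
    path: "/"
}]);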

Getting cookies

const cookie = await ayakashi.getCookie({ //filter object
    key: "myKey", //optional
    domain: "somepage.com", //optional
    path: "/test", //optional
    url: "https://somepage.com" //optional
});

All keys in the filter object are optional. Use them in any combination to get the cookie you want.
A getCookies() method is also available to get multiple cookies that pass the filter at once.
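For example, a minimal sketch of fetching every cookie that belongs to a domain (the domain is just illustrative):

//get all cookies set for somepage.com
const cookies = await ayakashi.getCookies({
    domain: "somepage.com"
});

Calling getCookies() without a filter object returns all available cookies.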

Sharing cookies between multiple pipeline steps

The automatic sharing described above only applies within a single scraper pipeline step.
If you need to share cookies between multiple pipeline steps (and scraper types), you can either enable persistentSession or use the cookie APIs described above, like this:

//inside a normal scraper
module.exports = async function(ayakashi, input, params) {
    //load a page
    await ayakashi.goTo("https://somepage.com");
    //get all the cookies
    const cookies = await ayakashi.getCookies();
    //return them as the step output
    return {cookies: cookies};
};
//then in an apiScraper
module.exports = async function(ayakashi, input, params) {
    //set the cookies we got from the input
    await ayakashi.setCookies(input.cookies);
    //hit your API
    const apiResponse = await ayakashi.get("https://somepage.com/api");
};
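
If you go the persistentSession route instead, no cookie passing between steps is needed. Below is a minimal sketch of an ayakashi.config.js, assuming persistentSession is set in the top-level config block; the step module names are just illustrative:

module.exports = {
    config: {
        //keep the session (including cookies) across all pipeline steps
        persistentSession: true
    },
    waterfall: [{
        //loads the page and picks up the session cookies
        type: "scraper",
        module: "loadPage"
    }, {
        //hits the API re-using the same session
        type: "apiScraper",
        module: "hitApi"
    }]
};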