I am using Algolia for search and we have a huge number of records. We want to delete some of them, and have decided to delete all records older than a given date.
At first I was using this:
const records = [];
const deleteRecordsBeforeDateAlgolia = (date) => {
  let client;
  // creds stuff
  const index = client.initIndex('function_message');
  // get the records before the given date
  try {
    index.search('', {
      filters: `time_stamp < ${date}`
    }).then(({ hits }) => {
      if (hits.length > 0) {
        for (const hit of hits) {
          index.deleteObject(hit.objectID);
          records.push(hit.objectID);
        }
      }
      if (hits.length === 0) {
        console.log(`Deleted ${records.length} records`);
      } else {
        // search again until no matching records remain
        deleteRecordsBeforeDateAlgolia(date);
      }
    });
  } catch (err) {
    console.error(err);
  }
};
but I realized this isn't well optimized and will be very slow when deleting on prod. Can you tell me how I can fetch a huge number of records with a filter (on the timestamp, in this case) and then delete all of them?
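For reference, the comments below sketch a more direct version of this first attempt: await the search, map the hits to their objectIDs, and delete them with one batched deleteObjects call instead of one deleteObject request per hit. A minimal sketch, assuming the algoliasearch v4 JavaScript client (deleteOnePage is a hypothetical helper name, and search() still returns only one page of hits, 20 by default):

// Sketch based on the comments: batch the deletions via deleteObjects.
// Assumes `index` was created as above.
const deleteOnePage = async (index, date) => {
  const { hits } = await index.search('', {
    filters: `time_stamp < ${date}`,
  });
  const idArray = hits.map((hit) => hit.objectID);
  // One batched delete request instead of hits.length individual ones.
  await index.deleteObjects(idArray);
  return idArray.length; // 0 means nothing older than `date` is left
};

Calling this in a loop until it returns 0 would replace the recursion above, with each pass issuing a single batched delete.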
EDIT
const records = [];
const deleteRecordsBeforeDateAlgolia = (date) => {
  let client;
  // creds stuff
  const index = client.initIndex('function_message');
  // get the records before the given date
  try {
    const search = index.browseObjects({
      filters: `time_stamp < ${date}`
    }).then(res => {
      // IT SHOWS RESPONSE IS UNDEFINED
      res.forEach(record => {
        records.push(record);
      });
      console.log(`found ${records.length} records`);
    });
  } catch (err) {
    console.error(err);
  }
};
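The undefined response appears to be expected: with the v4 JavaScript client, browseObjects delivers hits through a batch callback option (as the comments below point out, and as the linked browse documentation shows), and the returned promise resolves without a value once browsing completes. A minimal sketch of the corrected collection step under that assumption (collectRecords is a hypothetical helper name):

// browseObjects streams hits page by page into the `batch` callback;
// the promise itself resolves with no value, hence `res` being undefined.
const collectRecords = async (index, date) => {
  const records = [];
  await index.browseObjects({
    query: '',
    filters: `time_stamp < ${date}`,
    batch: (hits) => {
      records.push(...hits);
    },
  });
  console.log(`found ${records.length} records`);
  return records;
};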
Comments:
Use deleteObjects. Btw, why not use async/await? – Sapphera
const hits = await index.search(...), and then idArray = hits.map(...) to return an array of objectIDs, and then await deleteObjects(idArray). – Sapphera
index.browseObjects({ query: '', filters: 'updated_at>1641905859', batch: batch => {...... – Sapphera
Note the query: '', and in the batch function you can do the pushing. This example: algolia.com/doc/api-reference/api-methods/browse/… – Sapphera
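Putting the two comment suggestions together, a sketch of the whole cleanup, assuming the algoliasearch v4 client (YOUR_APP_ID / YOUR_ADMIN_API_KEY and the function name are placeholders; browsing needs an API key with the browse ACL, which the admin key has):

const algoliasearch = require('algoliasearch');

const deleteRecordsBeforeDate = async (date) => {
  const client = algoliasearch('YOUR_APP_ID', 'YOUR_ADMIN_API_KEY'); // placeholder creds
  const index = client.initIndex('function_message');

  // Step 1: browse every matching record and collect the objectIDs
  // from the batch callback.
  const objectIDs = [];
  await index.browseObjects({
    query: '',
    filters: `time_stamp < ${date}`,
    batch: (hits) => {
      for (const hit of hits) objectIDs.push(hit.objectID);
    },
  });

  // Step 2: delete everything in batched operations.
  await index.deleteObjects(objectIDs);
  console.log(`deleted ${objectIDs.length} records`);
};

Unlike search, browse is not capped at one page of hits, so a single pass collects every matching record, and one deleteObjects call (chunked internally by the client) removes them all.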