I have a query that uses the pipeline form of $lookup together with $expr. It works, but performance is poor: it filters a collection of about 4,000 documents and joins two other collections (via $lookup stages), yet it takes about 2,000 ms even though each collection holds only a few thousand documents.
The query looks something like this:
{
  $match: {
    language: 'str'
  }
},
{
  $lookup: {
    from: 'somecollection',
    let: { someId: '$someId' },
    pipeline: [
      {
        $match: {
          $expr: {
            $and: [
              { $eq: ['$_id', '$$someId'] },
              { $gte: ['$field', value] },
              { $lte: ['$field2', value] }
              // ... some more conditions
            ]
          }
        }
      }
    ]
  }
}
Running explain() on this provides info about the first $match block only. How can I figure out whether the $expr in the pipeline is using an index or not?
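For reference, this is how I'm checking it: running the aggregation through explain in executionStats mode and looking for IXSCAN vs. COLLSCAN in the output (a sketch; the outer collection name `items` is an assumption, and the $lookup sub-pipeline is abbreviated to the join condition):

```javascript
// Sketch: re-run the aggregation through explain in executionStats mode.
// Collection name "items" is an assumption; replace with your own.
const pipeline = [
  { $match: { language: 'str' } },
  {
    $lookup: {
      from: 'somecollection',
      let: { someId: '$someId' },
      pipeline: [
        { $match: { $expr: { $eq: ['$_id', '$$someId'] } } }
      ],
      as: 'joined'
    }
  }
];

// In the mongo shell -- look for IXSCAN vs COLLSCAN in the output:
// db.items.explain('executionStats').aggregate(pipeline);
```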
I tried adding indexes on all fields used in the pipeline, and I also tried creating a compound index, but neither made it any faster.
How could I improve performance?
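The compound-index attempt looked roughly like this (a sketch; the field names are assumptions, and _id already has a default index):

```javascript
// Sketch of the compound-index attempt (field names are assumptions).
// _id already has a default index, so the range fields are the candidates:
const indexSpec = { field: 1, field2: 1 };

// In the mongo shell:
// db.somecollection.createIndex(indexSpec);
```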
The structure of my query:
match (filter by language),
lookup (col1 join),
lookup (col2 join),
project,
sort
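As pipeline stages that shape looks like this (a sketch; the collection names col1/col2, the field otherId, and the sort key are assumptions, and the $lookup sub-pipelines are abbreviated to the join key):

```javascript
// Sketch of the overall pipeline shape (names are assumptions):
const stages = [
  { $match: { language: 'str' } },                                   // filter by language
  { $lookup: { from: 'col1', let: { someId: '$someId' },             // col1 join
               pipeline: [{ $match: { $expr: { $eq: ['$_id', '$$someId'] } } }],
               as: 'col1Docs' } },
  { $lookup: { from: 'col2', let: { otherId: '$otherId' },           // col2 join
               pipeline: [{ $match: { $expr: { $eq: ['$_id', '$$otherId'] } } }],
               as: 'col2Docs' } },
  { $project: { language: 1, col1Docs: 1, col2Docs: 1 } },
  { $sort: { someField: 1 } }
];
```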
I'm using MongoDB 4.0.
EDIT:
Actually, can the operators in the pipeline use an index at all? I've heard that only $eq can use indexes and the other operators can't. Is this true?
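One workaround I've seen suggested (a sketch; the collection/field names and the literal range values are assumptions): switch to the classic localField/foreignField form, which joins by equality and can use an index on the foreign field, then apply the remaining range conditions with $unwind plus a plain $match instead of $expr inside the sub-pipeline.

```javascript
// Workaround sketch: classic localField/foreignField $lookup (index-friendly
// equality join), then the range conditions as an ordinary $match.
const pipeline = [
  { $match: { language: 'str' } },
  {
    $lookup: {
      from: 'somecollection',
      localField: 'someId',
      foreignField: '_id',     // equality join; the foreign _id index applies
      as: 'joined'
    }
  },
  { $unwind: '$joined' },
  // range conditions moved out of the sub-pipeline (values are assumptions)
  { $match: { 'joined.field': { $gte: 0 }, 'joined.field2': { $lte: 100 } } }
];
```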
Thanks in advance!
Run explain to see if it does IXSCANs. If it does, then your problem is not with index usage but with having to execute the sub-pipeline once for each source document from the first stage. – Septarium