Description
Hey, so I've been trying to figure out how to use your library in my project, and I ran across a use case which would require an extension to field.complexity(). It's a bit hard to explain given a lack of well-defined terms, but perhaps I can demonstrate what I'm looking for in an example.
Let's say I have an Album type:
const Album = new GraphQLObjectType({
name: `Album`,
fields: () => ({
id: {
type: GraphQLID
},
totalPhotos: {
type: GraphQLInt // Let's say we have 100 Photos in the Album
},
photos: {
type: new GraphQLList(Photo),
args: {
count: { type: GraphQLInt } // And we pass a Count of 150 in our Query
},
      // It would therefore follow that if we used args.count alone, our complexity would be inflated.
      // The calculated complexity of this node should be capped at 100.
complexity: (args, childComplexity, parent) =>
childComplexity * (args.count < parent.totalPhotos ? args.count : parent.totalPhotos),
      // Let's assume our resolver returns an array of Photos of size count, or all the Photos in the Album
resolve: (parent, args, info) => fetchAlbumPhotos(parent.id, args.count)
}
})
})
To make this possible, a change would need to be made to field.complexity(), i.e. the addition of a third argument, parent, similar to resolve as used in the above example. parent would be an object containing the current values of the resolved Type.
I hope my code comments make clear how the complexity calculation could be inaccurate in a case such as this. If we look at maximumComplexity as a budget for our queries, then we don't want to overpay for the cost of execution.
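To illustrate the arithmetic, here is a standalone sketch of the capping logic with the proposed three-argument signature. The helper name and the sample values are hypothetical, purely for demonstration; this isn't part of the library's current API.

```javascript
// Hypothetical helper mirroring the proposed estimator signature:
// complexity(args, childComplexity, parent).
// Caps the multiplier at the number of photos that actually exist,
// so requesting more photos than the album holds doesn't inflate cost.
function cappedPhotoComplexity(args, childComplexity, parent) {
  const effectiveCount = Math.min(args.count, parent.totalPhotos);
  return childComplexity * effectiveCount;
}

// With 100 photos in the album and a requested count of 150,
// the multiplier is capped at 100 rather than inflated to 150.
console.log(cappedPhotoComplexity({ count: 150 }, 1, { totalPhotos: 100 })); // 100
```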
Let me know if you need further clarification!