diff --git a/Querying-in-Chunks-for-Big-Queries.md b/Querying-in-Chunks-for-Big-Queries.md
new file mode 100644
index 0000000..b05152b
--- /dev/null
+++ b/Querying-in-Chunks-for-Big-Queries.md
@@ -0,0 +1,35 @@
+## The Query Chunks Feature
+
+It's very unusual for us to need to load a number of records from the
+database that might be too big to fit in memory, e.g. loading all the
+users and sending them somewhere. But it can happen.
+
+For these cases, it's best to load the data in chunks, so that we can
+work on a substantial number of records at a time without ever
+overloading our memory. For this use case we have a dedicated
+function called `QueryChunks`:
+
+```golang
+err = db.QueryChunks(ctx, ksql.ChunkParser{
+	Query:     "SELECT * FROM users WHERE type = ?",
+	Params:    []interface{}{usersType},
+	ChunkSize: 100,
+	ForEachChunk: func(users []User) error {
+		err := sendUsersSomewhere(users)
+		if err != nil {
+			// This will abort the QueryChunks loop and return this error
+			return err
+		}
+		return nil
+	},
+})
+if err != nil {
+	panic(err.Error())
+}
+```
+
+Its signature is more complicated than those of the other two Query\*
+methods, so it is advisable to prefer the other two whenever possible,
+reserving this one for the rare use cases where you are actually
+loading big sections of the database into memory.
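+
+For completeness, here is a minimal sketch of the supporting pieces the
+example above assumes. The `User` struct, the `usersType` variable and
+the `sendUsersSomewhere` function are illustrative stand-ins rather than
+part of ksql itself, so adapt them to your own schema and destination:
+
+```golang
+// User is a hypothetical struct mapped to the `users` table,
+// using ksql struct tags to name the columns.
+type User struct {
+	ID   int    `ksql:"id"`
+	Name string `ksql:"name"`
+	Type string `ksql:"type"`
+}
+
+// usersType is the value bound to the `?` placeholder in the query above.
+var usersType = "admin"
+
+// sendUsersSomewhere stands in for whatever you do with each chunk of
+// up to 100 users, e.g. pushing them to an external service.
+func sendUsersSomewhere(users []User) error {
+	// ... send the batch; any error returned here will make the
+	// ForEachChunk callback abort the QueryChunks loop.
+	return nil
+}
+```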