Let me preface this by saying that I can provide a proof of concept if needed; for now I just wanted to throw out an initial finding and see if anyone has thoughts. I also know this exceeds the typical use case, so please bear with me:
I am working on a command-line .NET program whose sole purpose is to perform an initial pull replication against a configured Sync Gateway that holds many multi-megabyte attachments and documents. However, about 2650 documents in (after downloading ~541 MB), it starts throwing OutOfMemoryExceptions. Barring any memory leaks on our side, are there built-in caching mechanisms that might be behaving too aggressively for our use case, and that we could tweak? The problem seems to be exacerbated by the fact that this has to be a 32-bit process, and possibly by .NET storing strings as UTF-16 in memory (doubling the footprint of ASCII-heavy JSON), which effectively halves the heap space available for cached objects. Environment: Windows 7 64-bit host (though the app is compiled with "Prefer 32-bit") and the CBLite SDK v1.2.1.1 from NuGet.
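For reference, the setup is nothing exotic; it's essentially the stock one-shot pull replication from the 1.x API. A minimal sketch is below (the database name and Sync Gateway URL are placeholders, not our actual values):

```csharp
using System;
using System.Threading;
using Couchbase.Lite;

class Program
{
    static void Main()
    {
        // Open (or create) the local database via the shared manager.
        var db = Manager.SharedInstance.GetDatabase("mydb"); // placeholder name

        // One-shot (non-continuous) pull replication from the Sync Gateway.
        var pull = db.CreatePullReplication(
            new Uri("http://sync-gateway.example.com:4984/mydb")); // placeholder URL
        pull.Continuous = false;

        // Block until the replication reports Stopped.
        var done = new ManualResetEventSlim();
        pull.Changed += (sender, e) =>
        {
            Console.WriteLine("{0}/{1} changes",
                e.Source.CompletedChangesCount, e.Source.ChangesCount);
            if (e.Source.Status == ReplicationStatus.Stopped)
                done.Set();
        };

        pull.Start();
        done.Wait();
    }
}
```

The OutOfMemoryExceptions surface partway through this single Start()-to-Stopped run, not across multiple runs, which is what makes me suspect in-process caching rather than anything we hold onto ourselves.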
I'm not necessarily looking for a solution so much as some brainstorming. Any thoughts? Thanks!