Simple (maybe) question - RAM usage on a data agent is configurable, but what is the data agent doing to require RAM?
I had assumed it was just a passthrough so wouldn't need to store or process any data locally, but is that the case?
I've also assumed the "chunking" property in a dataset is processed on the source DB, but maybe the data agent plays a role there - parsing out the result set?
In any case, RAM is fairly abundant, so I'm willing to allocate more to the DA, but I sure would like to know why before I do it.
@JoeM - we need more labels (or make them not mandatory). I'm kind of just rotating through the non-version labels when the existing ones don't seem to apply.
The Data Agent (DA) uses memory to buffer data fetched from data sources and to compress and encrypt that data before transmitting it to Incorta. The DA retrieves data in batches, so its memory needs do not grow with the total size of the data being extracted, and there is no need to increase its memory allocation just because the source tables are large.
However, the DA's memory requirements do increase as the number of concurrently extracted tables grows, or when chunking is enabled, since each concurrent extraction (and each chunk reader) holds its own in-flight batch.
The default memory allocation for the DA is 2 GB, but it's recommended to allocate 4 GB.
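The memory model described above can be illustrated with a rough back-of-the-envelope sketch. This is not Incorta's actual implementation - the batch size, row size, and overhead factor below are purely hypothetical assumptions - but it shows why peak memory tracks concurrency and chunking rather than total data volume:

```python
# Illustrative sketch of why a batching agent's peak memory scales with
# concurrency and chunk count rather than total dataset size.
# All constants here are hypothetical assumptions, not Incorta defaults.

BATCH_ROWS = 50_000        # rows fetched per round trip (assumed)
AVG_ROW_BYTES = 512        # average serialized row size (assumed)
OVERHEAD_PCT = 130         # extra buffer while compressing/encrypting (assumed 1.3x)

def peak_agent_memory_bytes(concurrent_tables: int, chunks_per_table: int = 1) -> int:
    """Rough upper bound on buffer memory the agent holds at once.

    Each concurrently extracted table keeps one in-flight batch per chunk
    reader. The total volume of data extracted never appears in the
    formula, because each batch is released after it is compressed,
    encrypted, and sent.
    """
    per_batch = BATCH_ROWS * AVG_ROW_BYTES * OVERHEAD_PCT // 100
    return concurrent_tables * chunks_per_table * per_batch

# One table, no chunking: ~33 MB of buffers, regardless of table size.
print(peak_agent_memory_bytes(1))
# Ten tables in parallel, 4 chunk readers each: ~1.3 GB.
print(peak_agent_memory_bytes(10, chunks_per_table=4))
```

Under these assumed numbers, a single unchunked extraction fits easily in the 2 GB default, while heavy parallel loads with chunking enabled are what push the agent toward the recommended 4 GB.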