NTO needs to extract 50 million records from a custom object in its Salesforce org every day. NTO is facing query timeout issues while extracting these records.

What should a data architect recommend in order to get around the time out issue?
A. Use a custom auto-number and formula field, and use them to chunk records while extracting data.
B. Use the REST API to extract data, as it automatically chunks records by 200.
C. Use an ETL tool for extraction of records.
D. Ask Salesforce support to increase the query timeout value.

Answer: C
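
For context, the underlying technique that avoids the timeout is range-based chunking: instead of one query scanning 50 million rows, the extraction runs many bounded queries, which is what option A describes explicitly and what most ETL tools do internally (often via the Bulk API with PK chunking). Below is a minimal sketch of that chunking idea against the standard REST query endpoint. The field name Record_Number__c, the object name Custom_Object__c, the instance URL, the token, and the chunk size are all illustrative assumptions, not part of the question.

```python
import requests

# Hypothetical values -- replace with real org details before use.
INSTANCE_URL = "https://example.my.salesforce.com"
ACCESS_TOKEN = "REPLACE_WITH_ACCESS_TOKEN"
API_VERSION = "v58.0"
CHUNK_SIZE = 250_000  # records per range query (assumption)

HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}


def run_query(soql: str) -> dict:
    """Run a SOQL query via the REST query endpoint."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/query",
        headers=HEADERS,
        params={"q": soql},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()


def extract_in_chunks(total_records: int):
    """Pull records in bounded numeric ranges so no single query scans the whole table."""
    for start in range(0, total_records, CHUNK_SIZE):
        end = start + CHUNK_SIZE
        # Record_Number__c is an assumed numeric formula field derived
        # from an auto-number field, as described in option A.
        soql = (
            "SELECT Id, Name FROM Custom_Object__c "
            f"WHERE Record_Number__c >= {start} AND Record_Number__c < {end}"
        )
        result = run_query(soql)
        yield from result.get("records", [])
        # Follow pagination within the chunk if it exceeds the query page size.
        while not result.get("done", True):
            resp = requests.get(
                INSTANCE_URL + result["nextRecordsUrl"],
                headers=HEADERS,
                timeout=120,
            )
            resp.raise_for_status()
            result = resp.json()
            yield from result.get("records", [])
```

This is only a sketch of the chunking pattern; a production extraction of this volume would normally go through the Bulk API or an ETL tool rather than the synchronous REST query endpoint.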