
I need to convert a SAS file into a flat file. These files can be pretty big, up to 60 GB in size. I wrote a script in R (below), but it reads the entire dataset into memory before exporting it to a CSV file. Is there a way I could convert such big files without running into memory constraints? I am open to using either R or Python. I am working on a machine that has 16 GB of RAM.

args = commandArgs(trailingOnly=TRUE)

library(sas7bdat)

# Reads the entire SAS file into memory, then writes it back out as CSV
MyData <- read.sas7bdat(file = args[1])
write.csv(MyData, file = args[2], row.names = FALSE)

1 Answer


In my opinion, you can solve this using pandas.read_sas with the chunksize argument:

Pandas read sas docs

For example, to iterate through the file 10,000 observations at a time:

import pandas as pd

chunk_size = 10**4  # read 10,000 rows per chunk

for chunk in pd.read_sas(filename, chunksize=chunk_size):
    process(chunk)

where process() is whatever you want to do with each chunk (append it to the output file, etc.).
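For your SAS-to-CSV case, process() could simply append each chunk to the output file. A minimal sketch, assuming hypothetical paths sas_path and csv_path for the input and output files:

import pandas as pd

sas_path = "big_file.sas7bdat"   # hypothetical input path
csv_path = "big_file.csv"        # hypothetical output path
chunk_size = 10**4               # 10,000 rows per chunk

for i, chunk in enumerate(pd.read_sas(sas_path, chunksize=chunk_size)):
    # Write the header only with the first chunk; append the rest without it.
    chunk.to_csv(csv_path,
                 mode="w" if i == 0 else "a",
                 header=(i == 0),
                 index=False)

This keeps only one chunk in memory at a time, so the 60 GB file never has to fit into your 16 GB of RAM.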

s3nh