I'm trying to import a large database table into R to do some global analysis. I connect to an Oracle DB with ROracle and use `dbGetQuery`. I select only the minimum columns and apply the necessary WHERE clauses directly in the query to reduce the size of the dataset, but it is still 40 columns by 12 million rows.
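For reference, this is roughly what my code looks like (connection details, table name, and column names are placeholders, not my real schema):

```r
library(ROracle)

drv <- dbDriver("Oracle")
con <- dbConnect(drv, username = "user", password = "pwd", dbname = "mydb")

# Only the needed columns, with WHERE clauses applied in the query itself,
# but the result set is still ~12 million rows x 40 columns
df <- dbGetQuery(con, "SELECT col1, col2  /* ... ~40 columns ... */
                       FROM my_table
                       WHERE some_flag = 'Y'")
```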
My PC has only 8 GB of RAM, so how can I handle this? Is there a way to keep the data on disk rather than in RAM, or something similar? The same thing done in SAS works fine.
Any ideas?