
I have a question: I previously wrote this post, where I said that I had problems when loading a large RDF file into my RDF triple store:

[How I can load a model in TDB TripleStore](http://stackoverflow.com/q/25850288/1281433)

That solution worked, but today it is causing another problem.

The relevant part of my code is:

import com.hp.hpl.jena.query.*;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.tdb.TDB;
import com.hp.hpl.jena.tdb.TDBFactory;
import com.hp.hpl.jena.tdb.TDBLoader;

String file = "C:\\file.nt";
String queryStr = "SELECT * { GRAPH <http://nameFile> { ?s ?p ?o } }";
String directory = "C:\\tdb";

// Open (or create) the disk-backed TDB dataset and start a write transaction.
Dataset dataset = TDBFactory.createDataset(directory);
dataset.begin(ReadWrite.WRITE);

// Query the named graph to check whether it already contains any triples.
Query query = QueryFactory.create(queryStr);
QueryExecution qexec = QueryExecutionFactory.create(query, dataset);
qexec.getContext().set(TDB.symUnionDefaultGraph, true);
ResultSet results = qexec.execSelect();

// If the graph is empty, load the file into it.
if (!results.hasNext()) {
    Model model = dataset.getNamedModel("http://nameFile");
    TDBLoader.loadModel(model, file);
}
qexec.close();

dataset.commit();
dataset.end();

I get this error:

java.lang.OutOfMemoryError: GC overhead limit exceeded

Why?

With the previous code I got a

java.lang.OutOfMemoryError: Java heap space

error instead, which I thought I had resolved. This code runs in a web service. I'm using NetBeans and GlassFish 4.0.

What am I doing wrong?

I have also tried the `-XX:-UseGCOverheadLimit` VM option, but it has no effect.
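
One way to verify whether such options ever reach the JVM that runs the web service is to log the effective heap limit from inside the application. A minimal sketch (the `HeapCheck` class name is just an illustration; the two lines in `main` can equally be called from a servlet or web service method):

public class HeapCheck {
    public static void main(String[] args) {
        // Runtime.maxMemory() reports the heap ceiling (-Xmx) the running JVM was given.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}

If the value printed inside GlassFish is smaller than the one printed by the standalone application, the options are being applied to the wrong JVM; for GlassFish they typically have to be set on the domain (e.g. in domain.xml or via asadmin), not in the NetBeans project settings.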

Thanks and sorry.

  • Does [java.lang.OutOfMemoryError: GC overhead limit exceeded](http://stackoverflow.com/q/5839359/1281433) help at all? – Joshua Taylor Sep 18 '14 at 20:17
  • I wonder if the answer to your earlier question, [How I can load a model in TDB TripleStore](http://stackoverflow.com/q/25850288/1281433) doesn't help here? It sounds like you're getting some in-memory models when you don't really want in-memory models. – Joshua Taylor Sep 18 '14 at 20:20
  • What do the lines `Model model = dataset.getNamedModel("http://nameFile"); TDBLoader.loadModel(model, file);` do? – Joshua Taylor Sep 18 '14 at 20:21
  • In those lines, I want to load a model into the triple store. After `dataset.end()` I return the model. As I said in the previous post, when I have a large RDF file I have a problem (heap space or GC overhead). I've noticed that sometimes I have no problem with that solution, and other times I do. Why? – Musich87 Sep 19 '14 at 08:34
  • Joshua, can you help me? I have tried all the solutions. I have added the different options under "VM options", but I get the same error. The error is on this line: `TDBLoader.loadModel(model, file);`. I would like to load my file (it is very big, about 700,000 triples) into a model. How can I do it? I have also read this link http://www.jvmhost.com/articles/what-is-java-lang-outofmemoryerror-gc-overhead-limit-exceeded, but it has no effect. – Musich87 Sep 20 '14 at 17:50
  • The reason I asked about those lines is that I wonder if you're unintentionally reading something into an in-memory model when you didn't mean to. I *think* that you should be able to get a model into TDB without a high memory overhead, since TDB is disk-based (see the sketch after these comments). – Joshua Taylor Sep 20 '14 at 17:58
  • I don't believe I'm unintentionally reading something into an in-memory model. I added this code at the beginning, as the first operation, but I always get the same problem. I noticed one thing: this code runs in a web service, and there it gives me this problem. If I run it as a normal Java application, I do not have this problem. Could this information be useful for solving the problem? – Musich87 Sep 20 '14 at 18:04
  • Perhaps—maybe the memory limits on the VM aren't the same as the memory limits for the web service? (That wouldn't surprise me, but I don't know much about that realm of Java, so I can't be of much help there.) – Joshua Taylor Sep 20 '14 at 18:06
  • I have added the VM options to my web service project and I get the same error. I do not know how to solve this problem. – Musich87 Sep 21 '14 at 06:51
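
Following up on the disk-based point in the comments above: here is a minimal sketch of loading the file straight into the TDB-backed named model, assuming Jena 2.x (the `com.hp.hpl.jena` packages) and the paths from the question; the `LoadIntoTdb` class name is illustrative. A caveat: a single huge write transaction may still be expensive, since TDB buffers a transaction's changes until commit, so this only avoids building a separate in-memory copy of the file.

import com.hp.hpl.jena.query.Dataset;
import com.hp.hpl.jena.query.ReadWrite;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.tdb.TDBFactory;

public class LoadIntoTdb {
    public static void main(String[] args) {
        // Open the same disk-backed dataset as in the question.
        Dataset dataset = TDBFactory.createDataset("C:\\tdb");
        dataset.begin(ReadWrite.WRITE);
        try {
            // A named model from a TDB dataset is disk-backed, so read()
            // streams the triples into TDB instead of materializing the
            // whole file as an in-memory model first.
            Model model = dataset.getNamedModel("http://nameFile");
            model.read("file:///C:/file.nt", "N-TRIPLE");
            dataset.commit();
        } finally {
            dataset.end();
        }
    }
}

If even this fails only under GlassFish, the heap-check sketch earlier in the question should tell whether the service JVM simply has a smaller `-Xmx` than the standalone run.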

0 Answers