
Summary: ORM for Apache Lucene
Category: libraries
License: Apache License
Owner(s): wettin


Limax is an object-orientation augmentation library for Apache Lucene, bringing features similar to those of an object-relational mapper (ORM) such as Hibernate. The consumer API is inspired by the Sleepycat BDB implementation.

  • Maps root or static inner classes and interfaces to a UML class model using Java 1.5 annotations
  • Lazy-loading transactional object persistence
  • Pluggable attribute tokenization with synchronized denormalization

The project is fully functional, but it is still at a beta stage and not recommended for production use. There are currently no benchmarks available.


I originally wrote this library because I found it messy to keep denormalized, tokenized attributes in documents synchronized when the denormalized data was updated. Coming from Prevayler and BDB for prototyping software, I was also looking for a more Lucene-focused persistence solution that required less development time and fewer runtime resources, by reducing the multiple data layers to a single one.


UML class model semantics

If you are unfamiliar with the terms and semantics used in UML, that is what you need to learn first. It's pretty simple, and the SVN repository contains lots of test cases (read: examples). There are many tutorials on the internet, and when I get time I'll write one that fits this project.

You are welcome to join the users-help mailing list and ask any questions.

The Java class model, like the meta-models of most other object-oriented programming languages, sees no difference between attributes and associations: they are all fields. In UML, an association can have ownership, qualifications, multiplicity, association classes, etc. The constraints and semantics bound to these elements are the metadata you annotate the Java fields (or bean methods in abstract classes and interfaces) with. Limax ensures their integrity.
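As an illustration, a one-to-many association with both ends annotated might look like the sketch below. It reuses the @BinaryAssociationEnd annotation from the Human example on this page, but the Library and Book classes themselves are invented for this fragment:

```java
import java.util.ArrayList;
import java.util.List;

/** One library owns many books; each book belongs to exactly one library. */
public class Library {
  @BinaryAssociationEnd(otherEndClass = Book.class, otherEndName = "library", multiplicity = "0..*")
  private List<Book> books = new ArrayList<Book>();
}

public class Book {
  @BinaryAssociationEnd(otherEndClass = Library.class, otherEndName = "books", multiplicity = "1")
  private Library library;
}
```

Limax would then be responsible for keeping both association ends consistent, the same way the parents/children ends are coupled in the Human example.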

Downloading the library

I recommend you use Maven. Append the following to your POM:
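The dependency snippet itself is missing from this copy of the page. As a placeholder, a Maven dependency entry has this shape — the group ID, artifact ID, and version below are assumptions, not the project's real coordinates:

```xml
<dependency>
  <!-- hypothetical coordinates; consult the project's repository for the real ones -->
  <groupId>se.example.limax</groupId>
  <artifactId>limax</artifactId>
  <version>0.1-beta</version>
</dependency>
```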



Example implementation

This demonstrates object persistence and tokenization, but not denormalization, association ownership, qualifications, etc.

Bean annotation

public class Human {

  private Long oid;

  private String name;

  @DateDiscretized(year = true, month = true, day = true)
  private Date born = new Date();

  @BinaryAssociationEnd(otherEndClass = Human.class, otherEndName = "parents", multiplicity = "0..*")
  private List<Human> children = new ArrayList<Human>();

  @BinaryAssociationEnd(otherEndClass = Human.class, otherEndName = "children", multiplicity = "2")
  private List<Human> parents = new ArrayList<Human>(2);

  /** A parameterless constructor is required by Limax. */
  public Human() {
  }

  public Human(String name, Date born) {
    this.name = name;
    this.born = born;
  }

  // getters and setters...
}

Transactional API

public class TestLimax extends TestCase {

  public void test() throws Exception {

    EntityStore store = new EntityStore(new RAMDirectory());

    // all access to the store goes via a transaction.
    // it contains a temporary lucene directory that is merged to the store at commit time.
    Transaction txn = store.newTransaction();

    Human me = new Human("My Name", new Date(1975, 07, 30));

    Human mom = new Human("Mamma Name", new Date(1942, 03, 10));

    Human dad = new Human("Pappa Name", new Date(1942, 03, 17));

    // it is enough to add one instance in the graph,
    // all associations will be coupled automagically.
    store.put(txn, mom);


    // by flushing the transaction we add instances to the transaction index.
    // this makes them available in the searcher of this transaction, but not anywhere else until committed.

    Query q = new TermQuery(new Term("@name/standard", "pappa"));

    assertEquals(0, txn.getCombinedSearcher().search(q).length());

    txn.flush(); // flush the transaction (this call is elided in the original listing; the method name is an assumption)


    assertEquals(1, txn.getCombinedSearcher().search(q).length());

    Transaction tmpTxn = store.newTransaction();
    assertEquals(0, tmpTxn.getCombinedSearcher().search(q).length());

    txn.close(); // if you don't close the transaction you will start leaking memory.

    // rather than closing we could use txn.reuse().
    // it will reuse the instances loaded from persistence
    // so they don't have to be loaded on request.

    // I believe the lazy loading pattern is very similar to the one used by Hibernate:
    // beans are extended at runtime with a transaction and getter/setter code
    // that loads the attributes when the first getter is called.
    // associated instances will be created, but not loaded,
    // to avoid loading the whole graph.

    txn = store.newTransaction();

    me = store.get(txn, me.getOid());

    // associates are lazily loaded even if you don't call getLazy().
    // using get() will load all attributes, but not the associations.


    // this is probably best demonstrated
    // by setting a breakpoint in your debugger
    // and inspecting the java fields before and after you read a value.

    // suppose the fields are made public, to demonstrate in a test case:

    assertTrue(me.parents.size() == 0);
    me.getParents(); // call the getter (this call is elided in the original listing)
    // as we called the getter, the association has been loaded.
    assertTrue(me.parents.size() == 1);


    // this is a good time to inspect the index using Luke.

    // I need to figure out some smart way to use the tokenization strategies
    // to form a query pipeline. I've started the work, but it needs a lot more.
  }
}


Subprojects

limax-demo — caveat emptor, a demonstrational Limax implementation