My website has a very granular permissions system for every client, which makes the permission data fairly large.
To keep a cap on database queries I have been loading bitmasks from a MySQL database at the start of a user's session and then saving them as session data, so it looks something like the structure below. This has allowed me to make just one (albeit complex) JOIN query per user session without creating a huge session file.
"permissions" => array(
    "type 1" => 'bitfield',
    "type 2" => 'bitfield',
    "type 3" => array(
        entity_id => 'bitfield',
        entity_id => 'bitfield'
    ),
    "type 4" => array(
        entity_id => 'bitfield',
        entity_id => 'bitfield'
    )
)
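For reference, checking one of these bitfields is just a bitwise AND, which is why this has been so cheap (a Python sketch just to show the idea; the flag names and values are made up):

```python
# Hypothetical flag values for one permission type (names are invented)
CAN_EDIT   = 1 << 0  # 1
CAN_DELETE = 1 << 1  # 2
CAN_ADMIN  = 1 << 2  # 4

bitfield = CAN_EDIT | CAN_ADMIN  # what gets stored in the session: just the int 5

can_edit = bool(bitfield & CAN_EDIT)      # True
can_delete = bool(bitfield & CAN_DELETE)  # False
```

One int per permission type is tiny to store, but as I say below, it is getting too inflexible.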
Permissions are entirely group based, so every person in a given group would have this replicated in their session data.
However, bitmasks are becoming a pain to use and I'm looking to move to an ACL. The reason I didn't use an ACL in the first place was to minimise database usage.
So now I am going to have an entirely database/cache-driven ACL without any bitmasks. However, storing huge arrays of permissions in user session data doesn't seem ideal (do you agree?).
I think the way to go is to use a flat-file cache to store group permissions. Would the easiest way to do this be a file per group? Does that still hold when there are 4,000+ groups, each with 4 permission types? (Two permission types are global, with around 40 permissions combined; two types are local, with around 40 permissions combined per entity, and each type has maybe 3 or 4 lots of 20 permissions.) Edit: for clarity, this means 160-200 permission entries per group.
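To illustrate the file-per-group idea I have in mind, here is a minimal sketch (Python rather than anything production-ready; the file layout, JSON format and function names are all just assumptions):

```python
import json
import os

def load_group_permissions(group_id, fetch_from_db, cache_dir="/tmp/perm_cache"):
    """Return the permissions dict for a group, cached as one JSON file per group.

    fetch_from_db(group_id) stands in for the big JOIN query and returns
    something like {"type 1": ..., "type 2": ..., ...}.
    """
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, "group_%s.json" % group_id)
    try:
        with open(path) as f:
            return json.load(f)           # cache hit: no database work at all
    except (OSError, ValueError):
        perms = fetch_from_db(group_id)   # cache miss: one query, then write the file
        tmp = path + ".tmp"
        with open(tmp, "w") as f:
            json.dump(perms, f)
        os.rename(tmp, path)              # atomic rename so readers never see a partial file
        return perms
```

Invalidation would presumably be deleting the group's file whenever its permissions change, so the next request rebuilds it.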
This seems like it would be a fairly huge cache! Would it be better to just accept heavy database usage on every page load? This kind of data size is what made bitmasks far easier, but they are simply not flexible enough any more.
This is made harder by the fact that pages are served by two different servers (sessions are sticky, so saving bitfields to session data wasn't a problem), so any cache would have to be synced between the servers. The database is on a separate server connected over a private network with a supposedly 1-gigabit connection.
Can any solutions be suggested? I think a quick-access cache such as memcached with this much data will just blow my memory usage out of the water. I am tempted to just use lots of database queries, but I think that may put too heavy a strain on the db server.
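For what it's worth, my back-of-envelope estimate of the raw data involved looks like this (both the entries-per-group figure and the per-entry byte count are rough guesses):

```python
groups = 4000
entries_per_group = 200   # top end of the 160-200 per-group estimate above
bytes_per_entry = 64      # pure guess at key + value + overhead per entry

total_bytes = groups * entries_per_group * bytes_per_entry
print("roughly %.1f MB" % (total_bytes / 1024.0 / 1024.0))  # roughly 48.8 MB
```

Is that the right way to think about the memcached footprint, or does per-item overhead make it much worse in practice?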
Fairly large question, I hope it's clear. If any of it needs clarification, let me know. Any solutions will be greatly appreciated!
Chris