6

Folks, I'm a developer of a social game. There are already 700k players in the game, about 7k new players register every day, and about 5k players are online at any given moment.

The DB server runs on pretty powerful hardware: a 16-core CPU, 24 GB of RAM, and a RAID-10 array with a BBU built on 4 SAS disks. I'm using Percona Server (patched MySQL 5.1), and the InnoDB buffer pool is currently 18 GB (although according to innotop only a few free buffers are available). The DB server performs pretty well (2k QPS, iostat %util is 10-15%, almost always 0 processes in the "b" state in vmstat, loadavg is 5-6). However, from time to time (every few minutes) I get about 10-100 slow queries, each of which may last 5-6 seconds.

There is one big InnoDB table in the MySQL database which occupies most of the space. It has about 300 million rows and its size is about 20 GB. Of course, this table is gradually growing... I'm starting to worry that it's affecting the overall performance of the database in a negative way. In the near future I'll have to do something about it, but I'm not sure what exactly.

Basically, the question boils down to whether to shard or simply add more RAM. The latter is simpler, of course, and it looks like I can add up to 256 GB of RAM. But should I invest the time in implementing sharding instead, since it scales further?
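To make the sharding option concrete, here is roughly the kind of thing I have in mind: routing each player's rows to one of several identical MySQL instances by hashing the user id. This is only a sketch; the shard hosts, schema, and column names below are made up for illustration.

```python
# Sketch of hash-based sharding by user_id (all names are hypothetical).
# Each shard is a separate MySQL instance holding the same `items` schema.
import MySQLdb

SHARD_HOSTS = ["db-shard-0", "db-shard-1", "db-shard-2", "db-shard-3"]

def shard_for(user_id):
    """Pick a shard deterministically from the user id."""
    return user_id % len(SHARD_HOSTS)

def load_items(user_id):
    """Fetch a player's items from the shard that owns their rows."""
    host = SHARD_HOSTS[shard_for(user_id)]
    conn = MySQLdb.connect(host=host, db="game", user="game", passwd="...")
    try:
        cur = conn.cursor()
        # All of one player's rows live on a single shard, so per-player
        # queries never have to touch more than one server.
        cur.execute("SELECT item_id, kind, qty FROM items WHERE user_id = %s",
                    (user_id,))
        return cur.fetchall()
    finally:
        conn.close()
```

The obvious catch with plain modulo is that adding a shard means reshuffling existing data; a directory table or consistent hashing would avoid that, but at the cost of extra complexity. That's part of why I hesitate.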

pachanga
  • Does your game require all 300 million rows to be in the live table? My first idea would be to introduce a notion of archiving and push less-important data into an archive table (same schema as the live one). – Wadih M. Mar 24 '11 at 20:10
  • Yep, it does. This table contains player-specific items... Of course, I could try to figure out which players have been inactive for a while and move their stuff to another table... but that looks like some sort of sharding ;) – pachanga Mar 24 '11 at 20:18
  • Where I work, we run master servers with 96 GB of memory and 8 cores (16 with HT on) and NetApp storage. Slaves have 48 GB of memory and 8/16 cores. Disks are RAID-10 with 6 SAS disks through a PERC controller. We do up to 23,000 queries/sec on InnoDB, using MySQL 5.1.x. We constantly monitor query quality and put special effort into resolving queries that can cause pileups; in hardcore cases, this ends up involving http://poormansprofiler.org/. All in all, several hundred MySQL instances in some three dozen replication hierarchies. – Isotopp Apr 11 '11 at 06:26
  • There are no good answers yet... so how are you managing your database currently? Maybe still one big single table? – Sanghyun Lee Feb 18 '13 at 07:37

3 Answers

1

Sharding seems reasonable if you need to keep all 300M+ rows live. It may be a pain to change now, but as your table grows and grows, there will come a point when no amount of RAM will solve the problem. With such massive amounts of data it may also be worth looking at something like CouchDB, since you could store documents of data rather than rows, i.e. one document could contain all the records for an individual user.
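As a rough illustration of the document idea (using the python-couchdb client; the database name and item fields are made up):

```python
# Sketch: one CouchDB document per user holding all of that user's items,
# replacing hundreds of relational rows (all names are illustrative).
import couchdb

server = couchdb.Server("http://localhost:5984/")
db = server["game"]  # assumes the "game" database already exists

db.save({
    "_id": "user:123456",  # one document per user, keyed by user id
    "type": "user_items",
    "items": [
        {"item_id": 1, "kind": "sword",  "qty": 1},
        {"item_id": 2, "kind": "potion", "qty": 40},
    ],
})

# Loading a player's whole inventory becomes a single key lookup.
inventory = db["user:123456"]["items"]
```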

Dan Watson
0

Sounds to me like your main database table could use some normalization. Does all your information really belong in that one table, or can you split it out into smaller tables? Normalization may incur a small performance hit now, but as your table grows, that cost will be dwarfed by the extra work involved in accessing a huge, monolithic table.
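For example (a purely illustrative split; I obviously don't know your real columns):

```python
# Illustrative only: split a wide monolithic table so that static item
# definitions are stored once and the big per-player table stays narrow.
import MySQLdb

conn = MySQLdb.connect(host="localhost", db="game", user="game", passwd="...")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE item_defs (          -- static data: one row per item type
        item_def_id INT UNSIGNED PRIMARY KEY,
        name        VARCHAR(64) NOT NULL,
        kind        VARCHAR(32) NOT NULL
    ) ENGINE=InnoDB
""")
cur.execute("""
    CREATE TABLE player_items (       -- per-player state only
        user_id     INT UNSIGNED NOT NULL,
        item_def_id INT UNSIGNED NOT NULL,
        qty         INT UNSIGNED NOT NULL DEFAULT 1,
        PRIMARY KEY (user_id, item_def_id)
    ) ENGINE=InnoDB
""")
conn.commit()
```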

dar7yl
  • The database is properly normalized. It has many tables; one of them is really huge, and that has nothing to do with normalization. – pachanga Mar 27 '11 at 20:08
0
Quoting the question: "I get about 10-100 slow queries, each of which may last 5-6 seconds."

And quoting a comment: "The database is properly normalized. It has many tables; one of them is really huge, and that has nothing to do with normalization." Reading this, I would say the problem lies in your queries, not in your hardware. Average companies can only dream of the kind of server you have!

If you write bad queries, it doesn't matter how well your tables are normalized: they will be slow. There is an almost similar question here with an answer (about a slow database and the like) that might give you something.

Have you also thought about archiving some of the data? For example, out of those 300 million rows, the table started at ID 1; is that row still being used? If not, why not archive it to another database or table (I would recommend a separate database)? I also don't believe all 700k users log in every day (respect if they do, but I doubt it!).
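Something along these lines, as a sketch (the items, users, and last_login names are my guesses, not your actual schema):

```python
# Sketch: move the rows of long-inactive players into an archive table with
# the same schema, then delete them from the live table (names are guesses).
import MySQLdb

conn = MySQLdb.connect(host="localhost", db="game", user="game", passwd="...")
cur = conn.cursor()

# Archive table with the same structure as the live one.
cur.execute("CREATE TABLE IF NOT EXISTS items_archive LIKE items")

# Copy, then delete, the items of players inactive for 90+ days.
cur.execute("""
    INSERT INTO items_archive
    SELECT i.* FROM items i
    JOIN users u ON u.user_id = i.user_id
    WHERE u.last_login < NOW() - INTERVAL 90 DAY
""")
cur.execute("""
    DELETE i FROM items i
    JOIN users u ON u.user_id = i.user_id
    WHERE u.last_login < NOW() - INTERVAL 90 DAY
""")
conn.commit()
```

In production you would run the copy and delete in small batches to avoid long locks, but that's the idea.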

You also said "This table contains player specific items"; what kind of items exactly?

Another question: can you post some of your 'slow' queries?

Have you also considered a caching system for data that rarely changes, maybe once a month, like gear and other game data?
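Roughly like this, as a sketch (assuming memcached with the python-memcached client; the table, key, and TTL are made up):

```python
# Sketch: serve rarely-changing reference data (e.g. gear definitions) from
# memcached so it stays off MySQL's hot path (names and TTL are made up).
import memcache
import MySQLdb

mc = memcache.Client(["127.0.0.1:11211"])

def get_gear_defs():
    gear = mc.get("gear_defs")
    if gear is None:  # cache miss: hit MySQL once, then cache the result
        conn = MySQLdb.connect(host="localhost", db="game",
                               user="game", passwd="...")
        cur = conn.cursor()
        cur.execute("SELECT gear_id, name, stats FROM gear")
        gear = cur.fetchall()
        conn.close()
        mc.set("gear_defs", gear, time=3600)  # re-read at most once an hour
    return gear
```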

Yoram de Langen