Calculations and Stuff
Let us make some rough assumptions. Very rough, just to get an idea.
First, let us assume they use MySQL.
Using the MySQL documentation here, we can estimate the data sizes:
MySQL Storage Requirements
Each house will need the following data values, using the largest integer type, BIGINT, for the IDs (a schema sketch follows the list):
House number (BIGINT, 8 bytes)
Owner number (BIGINT, 8 bytes)
Locked/unlocked (TINYINT, 1 byte)
House name (VARCHAR, assume 15 characters max, so 16 bytes with the length prefix)
House description (VARCHAR, assume 120 characters max, so 121 bytes with the length prefix)
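As a sketch, the main house table could look something like this in MySQL (table and column names are my own assumptions; the types match the byte counts above, assuming a single-byte character set):

```sql
-- Hypothetical per-house table; names are assumptions, types match the list above.
CREATE TABLE houses (
    house_id    BIGINT       NOT NULL,  -- house number, 8 bytes
    owner_id    BIGINT       NOT NULL,  -- owner number, 8 bytes
    is_locked   TINYINT      NOT NULL,  -- locked/unlocked flag, 1 byte
    name        VARCHAR(15),            -- 15 chars + 1 length byte = 16 bytes
    description VARCHAR(120),           -- 120 chars + 1 length byte = 121 bytes
    PRIMARY KEY (house_id)
);
```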
----------------
Secondary functions:
Ownership:
Additional owners go in a separate table (sketched below):
House number (8 bytes, primary index)
Player number (secondary owner) (8 bytes)
Access level (for varying degrees of editing freedom) (8 bytes)
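A minimal sketch of that table, with hypothetical names:

```sql
-- Hypothetical secondary-owners table; one row per additional owner of a house.
CREATE TABLE house_owners (
    house_id     BIGINT NOT NULL,  -- house number, 8 bytes
    player_id    BIGINT NOT NULL,  -- secondary owner's player number, 8 bytes
    access_level BIGINT NOT NULL,  -- degree of editing freedom, 8 bytes
    PRIMARY KEY (house_id, player_id)  -- a house can have several extra owners
);
```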
Items:
Assume 200 items per house (each house contributes 200 rows to this table; a sketch follows the list):
House number (8 bytes primary index)
ItemID (8 bytes)
Bind state (1 byte)
Colour (8 bytes)
X-coords (Use Double: 8 bytes)
Y-coords (Use Double: 8 bytes)
Rotation (Use Double: 8 bytes)
On table/other item (the supporting item's ItemID, 8 bytes)
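Again as a sketch with hypothetical names; each of the ~200 item rows carries 57 bytes of these fields:

```sql
-- Hypothetical placed-items table; roughly 200 rows of this per house.
CREATE TABLE house_items (
    house_id   BIGINT  NOT NULL,  -- house number, 8 bytes
    item_id    BIGINT  NOT NULL,  -- ItemID, 8 bytes
    bind_state TINYINT NOT NULL,  -- 1 byte
    colour     BIGINT  NOT NULL,  -- 8 bytes
    x_coord    DOUBLE  NOT NULL,  -- 8 bytes
    y_coord    DOUBLE  NOT NULL,  -- 8 bytes
    rotation   DOUBLE  NOT NULL,  -- 8 bytes
    parent_id  BIGINT,            -- ItemID of the table/item it sits on, 8 bytes
    KEY (house_id)                -- index for fetching a whole house at once
);
```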
----
The total data cost of one house comes to 154 bytes (house record) + 24 bytes (ownership record) + 200 × 57 bytes (item records) = 11,578 bytes, or approximately 1.158×10^-5 GB.
That number times 3 million players is roughly
34.73 GB.
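As a sanity check, the arithmetic can be run in MySQL itself, with the per-row sizes taken from the field lists above:

```sql
-- 154 bytes per house row + 24 per ownership row + 200 item rows x 57 bytes,
-- times 3 million houses, converted to decimal gigabytes.
SELECT (154 + 24 + 200 * 57) * 3000000 / 1e9 AS total_gb;  -- ~34.73
```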
Ah, you might say, it is not the data storage that is the problem but the time it takes to retrieve it!
This benchmark tool comparing MySQL and MongoDB shows MySQL fetching 5,000 rows out of 10,000,000 in approximately 565 ms with two threads; MongoDB was able to retrieve the same results in approximately 2 ms. Using more cores speeds this up further.
That 5,000-row fetch corresponds to loading around 12 houses at any given time (assuming each house consumes about 400 rows). This also assumes the houses are stored on one central housing server, and not on their respective servers (such as Balmung or whatever).
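For reference, the kind of per-house fetch being discussed would look something like this against the sketch tables above (names remain my assumptions):

```sql
-- Pull everything needed to render one house: the house row, its extra
-- owners, and all of its placed items. All three hit an index on house_id.
SELECT * FROM houses       WHERE house_id = 12345;
SELECT * FROM house_owners WHERE house_id = 12345;
SELECT * FROM house_items  WHERE house_id = 12345;
```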
Add a layer of housing caching on top of this (only update when someone is editing) and these times will only get faster.
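One cheap way to get that caching behaviour, as a sketch (the column is my addition, not part of the estimate above): stamp each house row whenever it is edited, so the cache layer only refetches a house whose stamp has changed.

```sql
-- Hypothetical cache-invalidation aid: MySQL bumps this timestamp on every
-- edit to the row, so a cache can skip the full refetch when it is unchanged.
ALTER TABLE houses
    ADD COLUMN last_edited TIMESTAMP NOT NULL
        DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP;
```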
Am I completely wrong, or am I missing something?
Because it should only take around 35 GB of data, which is almost negligible for a database these days, with big data being the new CS fad.
If you see any corrections/improvements/etc., please let me know. I am trying to work out roughly how much space/power this would take. I am not looking for optimizations, since I am trying to estimate the worst case.