Data Mining
General Question Time:
How do you guys gather data during alpha/beta for analysis/balancing?
Do you notify people you are doing that?
What kind of data do you gather?
Thanks.
Comments
Valuable Analytic Data from my experience:
Basics are:
DAU - Daily Active Users
Retention Funnel - % of players that are still in the game at each point of interaction during the first session
for example: before the loading screen, after the loading screen, after the first tutorial instruction, after execution of the tutorial instruction, etc. The more small steps you track, the easier it is to identify and fix problems.
First Session retention - as mentioned above
1 Day retention, how many players return after the first day
3 Day retention - if 1 and 3 day retention are bad, it means that your core mechanics are the problem and you should scrap them and restart.
7 Day retention
14 Day retention
21 day retention
30 day retention
60 day retention
90 day retention and so on
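Day-N retention as listed above boils down to one calculation: of everyone who installed, what fraction was seen again exactly N days later. A minimal sketch (the install/activity data shapes are my own assumption, not from the thread):

```python
from datetime import date, timedelta

def day_n_retention(installs, activity, n):
    """Fraction of the cohort seen again exactly n days after install.

    installs: dict of player_id -> install date
    activity: dict of player_id -> set of dates the player was seen
    """
    if not installs:
        return 0.0
    returned = sum(
        1 for player, installed in installs.items()
        if installed + timedelta(days=n) in activity.get(player, set())
    )
    return returned / len(installs)

# Four players installed on Jan 1; two came back the next day,
# two were seen again a week later.
installs = {p: date(2024, 1, 1) for p in "abcd"}
activity = {
    "a": {date(2024, 1, 2), date(2024, 1, 8)},
    "b": {date(2024, 1, 2)},
    "c": {date(2024, 1, 8)},
    "d": set(),
}
print(day_n_retention(installs, activity, 1))  # 0.5
print(day_n_retention(installs, activity, 7))  # 0.5
```

In practice you would run this per install-date cohort rather than over all players at once, so a bad update shows up in the cohorts that installed after it.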
You can also track
Session length
Sessions per day
Average time of day per session - be careful here, as you have to work in averages, so split the day up into morning, afternoon, evening, and late at night. With this data you can better understand the player and what experience they are looking for.
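The time-of-day bucketing above can be sketched like this (the exact hour boundaries are my own choice, not from the thread):

```python
from collections import Counter

def daypart(hour):
    """Bucket a session start hour (0-23) into coarse dayparts."""
    if 5 <= hour < 12:
        return "morning"
    if 12 <= hour < 17:
        return "afternoon"
    if 17 <= hour < 23:
        return "evening"
    return "late night"

# Hours at which sessions started (player's local time):
session_starts = [8, 13, 19, 20, 23, 2, 9]
counts = Counter(daypart(h) for h in session_starts)
print(counts)
```

Counting sessions per bucket instead of averaging raw clock times avoids the trap where a player who plays at 7am and 11pm "averages out" to mid-afternoon.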
Then from here it depends on your game. In combat games I liked to track:
Fail rate - how many retries it took to beat a checkpoint
Death rate vs NPC
Loot drops (if any)
Time per battle
Shots to kill per NPC
Shots per NPC to kill the player
I think you get the picture. Anything that has a read-out you can measure; group it either by level or by NPC, depending on what you have and what makes sense.
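Grouping raw combat events by NPC type, as suggested above, is a simple aggregation. A sketch (the event shape and NPC names are hypothetical):

```python
from collections import defaultdict
from statistics import mean

# One row per kill: (npc_type, shots it took to bring it down)
kill_events = [
    ("grunt", 3), ("grunt", 4), ("brute", 12), ("brute", 10), ("grunt", 5),
]

shots_by_npc = defaultdict(list)
for npc, shots in kill_events:
    shots_by_npc[npc].append(shots)

# Average shots-to-kill per NPC type: grunt 4, brute 11
avg_shots_to_kill = {npc: mean(shots) for npc, shots in shots_by_npc.items()}
print(avg_shots_to_kill)
```

These averages are exactly the numbers you would compare against the predicted values in a balancing spreadsheet.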
I track these numbers and match them against my balancing to check that my Master Document's formulas are correct in their predicted read-outs.
If you created your Master Document with graphs, you can match the analytic data against the graphs and instantly see the result.
Also, don't roll your own.
Looking forward to Wednesday then @dislekcia!
@iPixelPierre: I believe most platforms require you to have a privacy policy if you are sending any kind of data? I'm not sure, though. It is probably best to have a privacy policy if you track any data at all anyway (to be legally covered).
Another well-known tool is
http://www.flurry.com/
I hope one of these is what you are looking for. :P
Just sending data to a PHP/JSON page and using that to insert into MySQL.
Works fine for what I want to do at this early stage in development.
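That kind of fire-and-forget reporting can be sketched in a few lines. This is my own illustration, not the poster's code; the endpoint URL, event names, and field names are all placeholders:

```python
import json
import urllib.request

# Hypothetical collection endpoint - substitute your own server-side script.
ANALYTICS_URL = "https://example.com/track.php"

def build_payload(event, **props):
    """Serialise one analytics event as a JSON body."""
    return json.dumps({"event": event, **props}).encode("utf-8")

def track(event, **props):
    """Fire-and-forget POST; losing the odd ping is acceptable here."""
    req = urllib.request.Request(
        ANALYTICS_URL,
        data=build_payload(event, **props),
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(req, timeout=2)
    except OSError:
        pass  # player offline or server down; fail silently

print(build_payload("checkpoint_failed", level=3, retries=7))
```

The server side then just decodes the JSON body and inserts a row per event; note that silent failure means offline sessions are simply lost unless you queue events on the client.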
It's not a complicated system: we just ping information to a URL from the game and go from there. It's related to the crash-information collection system we used during the beta (and that's still running; no crash dumps since early Jan, hah). It does require people to be online, though, and sometimes doesn't complete, because it fails silently and we don't bother caching stuff on the client side. But it's not super essential to us figuring out the game at this stage; that's what the beta was for.
For balance we made the game obscenely difficult and ratcheted that down as we wanted players to explore more of the end-game once the runaway strategies had been discovered and nerfed.
We didn't do stuff like move buttons around via A/B testing. When it was time to do UI stuff like that, we just gave people both UI options as a setting and randomised which one it started at. The plan was to use the most popular option at the end, but we kinda just left most of those settings in because people liked them both ways and we'd already done the setup anyway ;)
I'm not super-convinced that A/B testing is that useful in actual game design. It's good for optimising towards specific measured targets, so I guess placing monetisation buttons and tutorial progression and stuff, but we just did the old time-separated testing and saw constant improvement.
On the A/B testing of core mechanics: I actually worked a crap load with this recently, and I found it is the most amazing thing to see. To some degree you could argue, as you did, that intuitive design can already tell you these things will be a success, but it is really great to see it in concrete, comparable results so you can reason through them for a better understanding of the player's mindset. For me, the main thing was seeing the effect you can have on players coming back just by denying them one feature or adjusting the difficulty. We ran a test on our missions and were able to identify specific missions that caused players to drop, so we changed those missions and improved our retention by a big amount (sorry, can't say how much - NDAs and all that). Overall for me it has been an awesome advantage.