I am developing a statistics module for my website that will help me measure conversion rates and other interesting data.
The mechanism I use is to store a row in a statistics table every time a user enters a specific zone of my site (I avoid duplicate records with the help of cookies); a simplified sketch of the table follows the list below.
For example, I have the following zones:
- Website - A general zone used to count unique users, since I recently stopped trusting Google Analytics.
- Category - Self descriptive.
- Minisite - Self descriptive.
- Product Image - When a user views a product or the lead submission form.
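To give an idea, the table is something along these lines (a simplified illustration, not my exact schema):

```sql
-- Simplified sketch of the hits table (illustrative names, not the exact schema)
CREATE TABLE ZoneHits (
    HitId     INT IDENTITY(1,1) PRIMARY KEY,
    UserToken VARCHAR(64) NOT NULL,  -- identifier stored in the dedup cookie
    Zone      VARCHAR(32) NOT NULL,  -- 'Website', 'Category', 'Minisite', 'ProductImage'
    HitTime   DATETIME    NOT NULL DEFAULT GETDATE()
);
```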
The problem is that after one month my statistics table is packed with too many rows, and the ASP.NET pages I wrote to parse the data load really slowly.
I thought of writing a service that would parse the data in some way, but I can't see how to do that without losing flexibility.
My questions:
- How do large-scale, data-intensive applications such as Google Analytics load their data so fast?
- What's the best way for me to do this?
- Maybe my DB design is wrong and I should put all the data in just one table?
Thanks for any help,
Aatan.
The keyword you are looking for is aggregation.
You are interested in some functions computed over your data, and instead of computing them "online" when the page is displayed, you compute them offline, either via a nightly batch process or incrementally as each log record is written.
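For example, assuming SQL Server (a guess, since you mention ASP.NET) and made-up table names, a nightly batch could roll the raw hits into a daily summary with plain SQL:

```sql
-- Nightly batch: roll yesterday's raw hits into a daily summary table
-- (ZoneHits and DailyZoneStats are illustrative names)
INSERT INTO DailyZoneStats (StatDay, Zone, Hits, UniqueUsers)
SELECT CAST(HitTime AS DATE),
       Zone,
       COUNT(*),
       COUNT(DISTINCT UserToken)
FROM   ZoneHits
WHERE  HitTime >= CAST(DATEADD(DAY, -1, GETDATE()) AS DATE)
  AND  HitTime <  CAST(GETDATE() AS DATE)
GROUP BY CAST(HitTime AS DATE), Zone;
```

Your ASP.NET report pages would then read the small summary table instead of scanning the raw log on every request.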
A simple enhancement would be to store counts per user/session instead of storing, and later counting, every single hit. That would reduce your analytic processing requirements by a factor on the order of the number of hits per session. Of course, it would increase processing costs when inserting log entries.
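A minimal sketch of that counting-at-insert-time approach, again with hypothetical table names (@SessionId and @Zone would be supplied by your page code):

```sql
-- One row per (session, zone); bump a counter instead of inserting a new hit row
UPDATE SessionZoneCounts
SET    Hits = Hits + 1
WHERE  SessionId = @SessionId AND Zone = @Zone;

IF @@ROWCOUNT = 0
    INSERT INTO SessionZoneCounts (SessionId, Zone, Hits)
    VALUES (@SessionId, @Zone, 1);
```

Under concurrency you would want to wrap the UPDATE/INSERT pair in a transaction so two requests from the same session cannot both take the INSERT branch.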
Another kind of aggregation is pre-aggregation: you aggregate up front along some dimensions of your data while leaving other dimensions available for ad-hoc browsing. This trades off performance, storage, and flexibility.
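For instance, if you pre-aggregate to (day, zone) granularity as in the hypothetical DailyZoneStats table above, a month's report only touches a handful of summary rows, but per-user drill-down is gone:

```sql
-- Monthly totals per zone straight from the summary table
SELECT Zone,
       SUM(Hits) AS TotalHits
       -- note: SUM(UniqueUsers) across days would double-count returning
       -- users; that lost precision is part of the flexibility trade-off
FROM   DailyZoneStats
WHERE  StatDay >= @FromDay AND StatDay < @ToDay
GROUP BY Zone;
```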