Over the past five years, we've used several products to analyze our web
server traffic and generate reports for individual departmental sites,
all of which have ultimately fallen short of our needs. The products
we've used are:
-- analog, a freeware, command-line product that analyzes individual
log files
-- net.analysis, which stores logs in a database and emails report
attachments (its cost has increased more than 30-fold)
-- WebTrends (we're having problems with the scheduling tool and with
generating reports in the volume we require)
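For context, the core of what we need from these products is simple: tally successful requests out of the access logs, grouped by departmental site. A minimal sketch in Python, assuming Common Log Format and a one-directory-per-department URL layout (the sample log lines and that layout are illustrative, not our actual configuration):

```python
import re
from collections import Counter

# Hypothetical sample lines in Common Log Format; a real run would read
# the server's access log file instead.
SAMPLE_LOG = """\
192.0.2.1 - - [10/Oct/2000:13:55:36 -0700] "GET /history/index.html HTTP/1.0" 200 2326
192.0.2.2 - - [10/Oct/2000:13:56:01 -0700] "GET /physics/news.html HTTP/1.0" 200 1042
192.0.2.3 - - [10/Oct/2000:13:56:12 -0700] "GET /history/faculty.html HTTP/1.0" 404 512
"""

# Pull the request path out of the quoted request portion of each line.
REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+)')

def page_views_by_department(log_text):
    """Count successful (status 200) page views, grouped by the
    top-level directory, treated here as one site per department."""
    counts = Counter()
    for line in log_text.splitlines():
        match = REQUEST_RE.search(line)
        if not match:
            continue
        parts = line.split()
        status = parts[-2]  # in CLF, the status code precedes the byte count
        if status != "200":
            continue
        path = match.group(1)
        dept = path.strip("/").split("/")[0] or "(root)"
        counts[dept] += 1
    return counts

print(page_views_by_department(SAMPLE_LOG))
```

The hard part, of course, is not the counting but doing it across heterogeneous servers and producing dozens of scheduled report variants, which is exactly where the commercial tools have struggled for us.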
It seems universities are caught in the middle of the web traffic
analysis market: we have 'enterprise-level' needs (heterogeneous web
servers serving out millions of page views, requiring that dozens of
different reports be generated monthly), but only low-to-middle-level
budgets for the proper analysis tools. What are other universities
doing for web analysis?
Thanks. I'll summarize if there's interest.
Participation and subscription information for this EDUCAUSE Constituent Group discussion list can be found at http://www.educause.edu/memdir/cg/.