Online Web Server Log Analyzer

ABSTRACT:

The vision of the Web Server Log Analyzer with On-demand Reporting is to maintain the network traffic details in a log file, which derives indicators about who, when, and how a web server is visited.

Objective

The scope of the Real-time Web Server Log Analyzer with On-demand Reporting is as follows:

- Daily Traffic – Displays downloads per day.

- Hourly Traffic – Displays downloads per hour.

- Referrer – Displays URLs that were active before files were downloaded.

- Browser – Displays the referring URL when you click a row in the Referrer report.

- DLs (Downloads) – Displays the number of times files have been downloaded.

- UAs (User Agents) – Displays the User Agents that are accessing the site. A User Agent is the name of the program that is requesting pages on a site. Usually, User Agents refer to web browsers.

- Accesses – Displays the number of accesses and bytes downloaded by each user.

- Searches – Displays the search queries that users have submitted to search engines.

- Search Words – Displays the words used as part of searches.

- Visitors – Displays the number of unique visitors to the site.

- Countries – Displays visitors’ countries and the number of requests and bytes downloaded.

- Status Codes – Displays status codes for HTTP requests.

- Errors – Displays error codes for HTTP requests.
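
As a sketch of how the Daily Traffic and Hourly Traffic reports above could be derived, the snippet below counts requests per day and per hour from Common Log Format timestamps. The sample log lines and the timestamp layout are illustrative assumptions (Apache's default format), not necessarily the exact format this project consumes:

```python
import re
from collections import Counter

# Timestamp field of the Common Log Format, e.g. [10/Oct/2023:13:55:36 +0000]
TS_RE = re.compile(r"\[(\d{2})/(\w{3})/(\d{4}):(\d{2}):")

def traffic_reports(log_lines):
    """Count requests per day and per hour from raw log lines."""
    daily, hourly = Counter(), Counter()
    for line in log_lines:
        m = TS_RE.search(line)
        if not m:
            continue  # skip malformed lines
        day, mon, year, hour = m.groups()
        daily[f"{year}-{mon}-{day}"] += 1
        hourly[f"{hour}:00"] += 1
    return daily, hourly

logs = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /a HTTP/1.1" 200 512',
    '1.2.3.4 - - [10/Oct/2023:13:58:01 +0000] "GET /b HTTP/1.1" 200 1024',
    '5.6.7.8 - - [11/Oct/2023:09:02:11 +0000] "GET /a HTTP/1.1" 404 0',
]
daily, hourly = traffic_reports(logs)
```

The remaining reports (Referrer, User Agents, Countries, and so on) would follow the same pattern of grouping parsed fields and counting.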

Definitions, Acronyms, Abbreviations

URL – Uniform Resource Locator

Browser – A web browser is a software application that enables a user to display and interact with text, images, videos, music, and other information typically located on a web page at a website on the World Wide Web or a local area network.

Web server – A computer program that is responsible for accepting HTTP requests from clients, which are known as web browsers, and serving them HTTP responses along with optional data content, which usually are web pages, for example HTML documents and linked objects (images, etc.).

Log – usually web servers have the capability of logging some detailed information, about client requests and server responses, to log files; this enables the webmaster to collect statistics by running log analyzers on the log files.
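
To make the log definition concrete, here is a sketch of parsing one Combined Log Format line into named fields. The regular expression assumes Apache's default "combined" layout (host, identity, user, timestamp, request, status, size, referrer, user agent); a real deployment may use a different layout:

```python
import re

# Combined Log Format: host ident authuser [date] "request" status bytes
# "referer" "user-agent"  (an assumed layout, matching Apache's default).
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<size>\d+|-)'
    r'(?: "(?P<referer>[^"]*)" "(?P<agent>[^"]*)")?'
)

def parse_line(line):
    """Return a dict of log fields, or None for a malformed line."""
    m = LOG_RE.match(line)
    return m.groupdict() if m else None

entry = parse_line(
    '127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" '
    '200 2326 "http://example.com/start" "Mozilla/5.0"'
)
```

Every report in the analyzer can be computed from entries of this shape.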

Overview

There are two kinds of log analyzers:

Post-parsing reporting – The log files are parsed and all of the reports are generated after that – usually on a scheduled basis. This can put a great strain on a computer, as the parsing and reporting are done in one go.

Real-time, on-demand reporting – The log files are parsed to a database in the background. A report is only generated when requested. This type of analyzer is usually better suited for most users, as it puts less strain on the server.
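
The on-demand approach can be sketched with a small database: parsed log entries are ingested into a table as they arrive (in a real deployment this ingestion runs in the background), and the expensive aggregation happens only when a report is requested. The table and column names below are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hits (host TEXT, path TEXT, status INTEGER, bytes INTEGER)")

def ingest(row):
    """Background step: store one parsed log entry; no report is built here."""
    conn.execute("INSERT INTO hits VALUES (?, ?, ?, ?)", row)

def accesses_report():
    """On-demand step: aggregate only when someone asks for the report."""
    return conn.execute(
        "SELECT host, COUNT(*), SUM(bytes) FROM hits GROUP BY host ORDER BY host"
    ).fetchall()

for row in [("1.2.3.4", "/a", 200, 512),
            ("1.2.3.4", "/b", 200, 1024),
            ("5.6.7.8", "/a", 404, 0)]:
    ingest(row)

report = accesses_report()
```

Because ingestion and reporting are decoupled, the server never has to parse the whole log and build every report in one pass, which is the strain the post-parsing approach suffers from.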

The term web server can mean one of two things:

A computer program that is responsible for accepting HTTP requests from clients, which are known as web browsers, and serving them HTTP responses along with optional data content, which usually are web pages, for example HTML documents and linked objects (images, etc.).

A computer that runs a computer program as described above.

Common features

Although web server programs differ in detail, they all share some basic common features.

HTTP: every web server program operates by accepting HTTP requests from the client and providing an HTTP response to the client. The HTTP response usually consists of an HTML document, but can also be a raw file, an image, or some other type of document (defined by MIME types); if some error is found in the client request or while trying to serve the request, the web server has to send an error response, which may include some custom HTML or text messages to better explain the problem to end users.
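
A minimal sketch of the response side of this exchange, showing how a status line, MIME type, and body (including an error body) are composed. This is illustrative of HTTP/1.1 message framing, not this project's implementation:

```python
# Map of the status codes used in this sketch to their standard reason phrases.
REASONS = {200: "OK", 404: "Not Found", 500: "Internal Server Error"}

def http_response(status, body, mime="text/html"):
    """Compose a complete HTTP/1.1 response string for a text body."""
    head = (f"HTTP/1.1 {status} {REASONS[status]}\r\n"
            f"Content-Type: {mime}\r\n"
            f"Content-Length: {len(body.encode())}\r\n\r\n")
    return head + body

ok = http_response(200, "<h1>hello</h1>")
# An error response carries custom HTML explaining the problem to the user.
err = http_response(404, "<h1>404: page not found</h1>")
```

The status code and byte count in each response are exactly the fields that later appear in the access log and feed the Status Codes and Errors reports.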

Logging: usually web servers also have the capability to log some detailed information, about client requests and server responses, to log files; this allows the webmaster to collect statistics by running log analyzers on the log files.

In practice many web servers also implement the following features:

Authentication, optional authorization request (request of user name and password), before allowing access to some or all kinds of resources.

Handling of static content (file content recorded in the server's filesystem(s)) and dynamic content by supporting one or more related interfaces (SSI, CGI, SCGI, FastCGI, JSP, PHP, ASP, ASP.NET, Server APIs such as NSAPI, ISAPI, etc.).

HTTPS support (by SSL or TLS) to allow secure (encrypted) connections to the server on the standard port 443 instead of the usual port 80.

Content compression (e.g. by gzip encoding) to reduce the size of the responses (to lower bandwidth usage, etc.).
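
The bandwidth saving from content compression can be demonstrated directly; this is the same idea as Apache's gzip encoding, shown here with Python's standard `gzip` module on made-up, repetitive HTML (real savings depend on how compressible the body is):

```python
import gzip

# A repetitive response body compresses very well; binary assets would not.
body = b"<html>" + b"log analyzer " * 200 + b"</html>"
compressed = gzip.compress(body)

# The client decompresses the body transparently; content is unchanged.
restored = gzip.decompress(compressed)
```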

Virtual hosting to serve many websites using one IP address.

Large file support, to be able to serve files whose size is greater than 2 GB on a 32-bit OS.

Bandwidth throttling, to limit the speed of responses in order not to saturate the network and to be able to serve more clients.

The main key performance parameters (measured under a varying load of clients and requests per client) are:

- Number of requests per second (depending on the type of request, etc.);

- Latency response time in milliseconds for each new connection or request;

- Throughput in bytes per second (depending on file size, cached or uncached content, available network bandwidth, etc.).
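
Two of these parameters (requests per second and throughput) can be estimated from the access log itself, as sketched below; latency cannot, because standard access logs do not record serving time unless the server is configured to log it (e.g. Apache's `%D` field), so it is omitted here. The entry tuples are an assumed shape:

```python
# Each entry is (unix_timestamp_seconds, bytes_sent) for one logged request.
def performance(entries):
    """Estimate requests/second and throughput in bytes/second over the
    time span covered by the entries."""
    if not entries:
        return 0.0, 0.0
    span = max(t for t, _ in entries) - min(t for t, _ in entries) or 1
    requests_per_sec = len(entries) / span
    throughput_bps = sum(b for _, b in entries) / span
    return requests_per_sec, throughput_bps

entries = [(0, 1000), (1, 2000), (10, 3000)]
rps, bps = performance(entries)
```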

Existing System

In the existing system, the log files are parsed and all of the reports are generated after that – usually on a scheduled basis. This can put a great strain on a computer, as the parsing and reporting are done in one go.

Disadvantages of the Existing System

The limitations of the existing system:

- Doesn't find out the number of requests per second

- Doesn't discover the causes of overload

- Doesn't provide security

Proposed System

The proposed system is designed to develop a Web Server Log Analyzer, which parses a log file from a web server (like Apache) and, based on the values contained in the log file, derives indicators about who, when, and how a web server is visited.

Problem Definition

This project is aimed at developing a web server log analyzer, which can analyze the web server access information. This is helpful for maintaining the network traffic details in a log file, which derives indicators about who, when, and how a web server is visited.

Advantages over the Existing System

- Daily Traffic – Displays downloads per day.

- Hourly Traffic – Displays downloads per hour.

- Referrer – Displays URLs that were active before files were downloaded.

- Browser – Displays the referring URL when you click a row in the Referrer report.

- Downloads – Displays the number of times files have been downloaded.

- User Agents – Displays the User Agents that are accessing the site. A User Agent is the name of the program that is requesting pages on a site. Usually User Agents refer to web browsers.

- Accesses – Displays the number of accesses and bytes downloaded by each user.
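
The Searches and Search Words reports listed earlier can be derived from the query string of search-engine referrer URLs. The sketch below assumes the engine passes the query in a `q` parameter, as Google and Bing do; other engines may use different parameter names:

```python
from urllib.parse import urlparse, parse_qs
from collections import Counter

def search_words(referrers):
    """Extract full search queries and individual search words from
    referrer URLs; non-search referrers are ignored."""
    queries, words = [], Counter()
    for url in referrers:
        qs = parse_qs(urlparse(url).query)
        for query in qs.get("q", []):  # 'q' parameter name is an assumption
            queries.append(query)
            words.update(query.lower().split())
    return queries, words

queries, words = search_words([
    "https://www.google.com/search?q=apache+log+analyzer",
    "https://www.bing.com/search?q=log+parser",
    "http://example.com/page",  # ordinary referrer, not a search engine
])
```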

MODULE DESCRIPTION

Real-time, on-demand Reporting –

The log files are parsed to a database in the background. A report is only generated when requested. This type of analyzer is usually better suited for most users, as it puts less strain on the server.

Logging:

Usually, web servers also have the capability to log some detailed information, about client requests and server responses, to log files; this enables the webmaster to collect statistics by running log analyzers on the log files.

Functions
The functions involved in the development of Real-time Webserver Log Analyzer with on-demand Reporting are:
- Daily Traffic Report
- Countries Report
- Accesses Report
- Searches Report
- User Agents Report
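
As an example of the User Agents Report function, raw User-Agent strings can be bucketed into browser families with simple substring rules. The token checks below are heuristic assumptions, not a full User-Agent parser:

```python
from collections import Counter

def browser_family(ua):
    """Crude User-Agent classification; order matters because Chrome and
    Edge UA strings also contain the token 'Safari'."""
    for token in ("Edg", "Chrome", "Firefox", "Safari"):
        if token in ua:
            return {"Edg": "Edge"}.get(token, token)
    return "Other"  # bots, curl, unknown agents

def user_agents_report(uas):
    """Count requests per browser family."""
    return Counter(browser_family(ua) for ua in uas)

report = user_agents_report([
    "Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 Chrome/120.0 Safari/537.36",
    "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0",
    "curl/8.4.0",
])
```

The other report functions (Daily Traffic, Countries, Accesses, Searches) would be analogous groupings over other parsed log fields.
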
Software Requirements
Operating System: Windows XP/2003
Programming Language: C#.NET
Framework: ASP.NET
Workbench: Visual Studio
Database: Access
Hardware Requirements
Processor: Pentium IV
Hard Disk: 40GB
RAM: 256MB
