Apache logs 2023
ARTLAS

ARTLAS is a real-time Apache log analyzer. Based on the OWASP Top 10 vulnerabilities, it detects attempts to exploit your web applications and notifies you or your incident response team via Telegram, Zabbix and Syslog/SIEM.

ARTLAS uses regular expressions from the PHP-IDS project to identify exploitation attempts.

For details and installation instructions, see: https://kali.tools/?p=4832

Unfortunately, this program is written in Python 2 and has not been updated for a long time.

Analyzing logs with command line tools (Bash)

Combining standard Linux commands is a convenient way to analyze logs quickly. It helps answer questions such as which IPs sent the most requests.

In the commands below, replace access_log with the name of your Apache log file. You can also give the full path, for example /var/log/httpd/access_log.

If the file is gzipped, use zcat instead of cat:

Code:
zcat site.ru/logs/access_log.1.gz

If a command does not use cat but the file is gzipped, you can adjust it slightly. For example, the following command processes the access_log file directly:

Code:
awk -F\" '{print $6}' access_log | sort | uniq -c | sort -fr

The awk program (see Awk tutorials), like most others, can read from standard input, so the same command can be rewritten as:

Code:
cat access_log | awk -F\" '{print $6}' | sort | uniq -c | sort -fr

Since it now starts with cat, the same pipeline works for compressed files:

Code:
zcat access_log.1.gz | awk -F\" '{print $6}' | sort | uniq -c | sort -fr
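The field numbers used in these commands come from Apache's combined log format; if your LogFormat directive is customized, the positions may differ. An annotated hypothetical line:

Code:
# A hypothetical combined-format line (values invented for illustration):
#   203.0.113.5 - - [10/Oct/2023:13:55:36 +0300] "GET /index.html HTTP/1.1" 200 2326 "https://example.com/" "Mozilla/5.0"
#
# Split on whitespace (awk's default):
#   $1 = client IP   $7 = requested URI   $9 = status code   $10 = response size
# Split on double quotes (awk -F\"):
#   $2 = request line   $4 = referer   $6 = user agent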
Search by arbitrary string

The simplest example: search the requests for an arbitrary string (an IP address, User-Agent, page address, etc.) using grep:

Code:
cat access_log | grep 'STRING'

To find all lines containing a specific response status, for example 403 (access denied):

Code:
cat access_log | grep '403'
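Note that grep '403' also matches lines where 403 appears elsewhere, for example in a URI or in the response size. A stricter sketch with awk, assuming the status code is field 9 as in the combined format:

Code:
# Match 403 only when it is the status code (field $9)
awk '$9 == 403' access_log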
List of all user agents, sorted by the number of times they have appeared:

Code:
awk -F\" '{print $6}' access_log | sort | uniq -c | sort -fr

Analysis of the various server responses and the requests that triggered them:

Code:
awk '{print $9}' access_log | sort | uniq -c | sort

The output shows how many responses of each type your site has returned. A "normal" result is code 200, meaning the page or file was requested and delivered. But many other codes are possible. The most common responses are:

200 - OK
206 - Partial Content
301 - Moved Permanently
302 - Found
304 - Not Modified
401 - Unauthorized (password required)
403 - Forbidden
404 - Not Found

A 404 error indicates a missing resource. Take a look at the requested URIs that produced this error:

Code:
grep "404" access_log | cut -d ' ' -f 7 | sort | uniq -c | sort -nr

Another way to display the most frequently not-found pages on the site:

Code:
cat access_log | awk '($9 ~ /404/)' | awk '{print $7}' | sort | uniq -c | sort -rn | head -n 25

The IP addresses that made the most requests:

Code:
cat access_log | awk '{print $1}' | sort | uniq -c | sort -rn | head -n 25

Top 25 IP addresses by number of requests, with their country:

Install the required dependencies:

Code:
sudo apt install geoip-bin geoip-database-extra

The command to display the country of the IP addresses that made the most requests to the server:

Code:
cat access_log | awk '{print $1}' | sort | uniq -c | sort -rn | head -n 25 | awk '{printf("%5d\t%-15s\t", $1, $2); system("geoiplookup " $2 " | cut -d \\: -f2")}'

To find sites that hotlink images from my site (for example, when stealing articles):

Code:
awk -F\" '($2 ~ /\.(jpg|png|gif)/ && $4 !~ /^https:\/\/(|www\.)hackware\.ru/){print $4}' access_log | sort | uniq -c | sort

Remember to edit the domain name in the previous and next commands.

To analyze all the archives:

Code:
zcat access_log.*.gz | awk -F\" '($2 ~ /\.(jpg|png|gif)/ && $4 !~ /^https:\/\/(|www\.)hackware\.ru/){print $4}' | sort | uniq -c | sort

Empty user agent

An empty user agent usually indicates that the request comes from an automated script. The following command lists the IP addresses behind such requests; based on it, you can decide what to do with them next - block or allow access:

Code:
awk -F\" '($6 ~ /^-?$/)' access_log | awk '{print $1}' | sort | uniq

Too much load from one source?

When your site is under heavy load, you need to figure out whether the load comes from real users or from something else:

- Setup or system problems
- A custom app or bot requesting information from your site too quickly

Displaying IP addresses sorted by the number of requests:

Code:
cat access_log | cut -d ' ' -f 1 | sort | uniq -c | sort -nr

The 10 most active IPs:

Code:
cat access_log | awk '{print $1}' | sort | uniq -c | sort -n -r | head -n 10

Traffic in kilobytes by status code:

Code:
cat access_log | awk '{total[$9]+=$10} END {for (x in total) {printf "Status code %3d: %9.2f Kb\n", x, total[x]/1024}}'

The 10 most popular referrers (don't forget to edit the domain name):

Code:
cat access_log | awk -F\" '{print $4}' | grep -v '-' | grep -v 'https://hackware.ru' | sort | uniq -c | sort -rn | head -n 10

The 10 most popular user agents:

Code:
cat access_log | awk -F\" '{print $6}' | sort | uniq -c | sort -rn | head -n 10
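To single out self-identified bots among those user agents, the same pipeline can be filtered; the bot|crawl|spider pattern here is only an illustrative assumption, since crawlers identify themselves in many ways:

Code:
# Count user agents that label themselves as bots/crawlers
# (the pattern is illustrative, not exhaustive)
cat access_log | awk -F\" '{print $6}' | grep -iE 'bot|crawl|spider' | sort | uniq -c | sort -rn | head -n 10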
[QUOTE="Cupper, post: 579, member: 22"] [B]ARTLAS[/B] ARTLAS is a real-time Apache log analyzer. Based on the top 10 OWASP vulnerabilities, this program detects attempts to exploit your web applications and notifies you or your incident response team via Telegram, Zabbix and Syslog / SIEM. ARTLAS uses regular expressions from the PHP-IDS project to identify exploitation attempts. For details and installation instructions, see the page: [URL]https://kali.tools/?p=4832[/URL] Unfortunately, this program is written in Python 2 and has not been updated for a long time. [B]Analyzing logs with command line tools (Bash)[/B] It is very convenient to use a combination of Linux commands for quick analysis of logs. This will help determine, for example, from which IPs the most requests came. In the following commands, replace the [B]access_log[/B] file with the name of your Apache log file. You can specify the full path to this file, for example, [B]/ var / log / httpd / access_log[/B]. If the file is zipped, use [B]zcat[/B] instead of [B]cat[/B]: Code: zcat site.ru/logs/access_log.1.gz If the command does not use [B]cat[/B], but the file is zipped, then you can slightly edit the command. For example, the following example processes the access_log file: Code: awk -F \ "'{print $ 6}' access_log | sort | uniq -c | sort -fr The [B]awk[/B] program (see Awk Tutorials), like most others, can accept data from standard input, so the same command can be rewritten as follows: Code: cat access_log | awk -F \ "'{print $ 6}' | sort | uniq -c | sort -fr As you can see, it now has [B]cat[/B] in it and therefore for compressed files this snippet can be used as follows: Code: zcat access_log.1.gz | awk -F \ "'{print $ 6}' | sort | uniq -c | sort -fr [B]Search by arbitrary string[/B] The simplest example, search among requests by an arbitrary string (IP address, User-Agent, page address, etc.) using grep: Code: cat access_log | grep 'STRING' To find all lines containing a specific response status, for example 403 (access denied): Code: cat access_log | grep '403' [B]List of all user agents, sorted by the number of times they have appeared[/B]: Code: awk -F \ "'{print $ 6}' access_log | sort | uniq -c | sort -fr [B]Analysis of various server responses and requests that triggered them:[/B] Code: awk '{print $ 9}' access_log | sort | uniq -c | sort The output shows how many types of requests your site has received. A "normal" request result is a code of 200, which means that the page or file was requested and delivered. But many other options are also possible The most common answers are: [LIST] [*][B]200[/B] - OK [*][B]206[/B] - Partial Content [*][B]301[/B] - Moved Permanently [*][B]302[/B] - Found [*][B]304[/B] - Not Modified [*][B]401[/B] - Unauthorized (password required) [*][B]403[/B] - Forbidden [*][B]404[/B] - Not Found [/LIST] A 404 error indicates a missing resource. Take a look at the requested URIs that got this error. 
Code: grep "404" access_log | cut -d '' -f 7 | sort | uniq -c | sort -nr Another option for displaying the most frequently not found pages on the site: Code: cat access_log | awk '($ 9 ~ / 404 /)' | awk '{print $ 7}' | sort | uniq -c | sort -rn | head -n 25 [B]The IP addresses that made the most requests:[/B] Code: cat access_log | awk '{print $ 1}' | sort | uniq -c | sort -rn | head -n 25 Top 25 IP addresses with the most requests showing their country: Install the required dependencies: Code: sudo apt install geoip-bin geoip-database-extra The command to display the country of the IP addresses that made the most requests to the server: Code: cat access_log | awk '{print $ 1}' | sort | uniq -c | sort -rn | head -n 25 | awk '{printf ("% 5d \ t% -15s \ t", $ 1, $ 2); system ("geoiplookup" $ 2 "| cut -d \\: -f2")} ' [B]To find sites that insert images of my site (when stealing articles, for example):[/B] Code: awk -F\" '($2 ~ /\.(jpg|png|gif)/ && $4 !~ /^https:\/\/(|www\.)hackware\.ru/){print $4}' access_log | sort | uniq -c | sort Remember to edit the domain name in the previous and next commands. To analyze all archives: Code: zcat access_log. * gz | awk -F \ "'($ 2 ~ /\.(jpg|png|gif)/ && $ 4! ~ /^https:\/\/(|www\.)hackware\.ru/) {print $ 4}' | sort | uniq -c | sort [B]Empty user agent[/B] An empty user agent usually indicates that the request is coming from an automated script. The following command will display a list of IP addresses for these user agents, and based on it, you can decide what to do with them next - block or allow access: Code: awk -F \ "'($ 6 ~ / ^ -? $ /)' access_log | awk '{print $ 1}' | sort | uniq [B]Too much load from one source?[/B] When your site is under heavy load, you need to figure out if the load is coming from real users or something else: [LIST] [*]Setup or system problems [*]A custom app or bot is requesting information from your site too quickly [/LIST] Displaying IP addresses sorted by the number of requests: Code: cat access_log | cut -d '' -f 1 | sort | uniq -c | sort -nr 10 most active IPs: Code: cat access_log | awk '{print $ 1; } '| sort | uniq -c | sort -n -r | head -n 10 Traffic in kilobytes by status codes: Code: cat access_log | awk '{total [$ 9] + = $ 10} END {for (x in total) {printf "Status code% 3d:% 9.2f Kb \ n", x, total [x] / 1024}}' 10 most popular referrers (don't forget to edit your domain name): Code: cat access_log | awk -F \ "'{print $ 4}' | grep -v '-' | grep -v '[URL]https://hackware.ru[/URL]' | sort | uniq -c | sort -rn | head -n 10 10 most popular user agents: Code: cat access_log | awk -F \ "'{print $ 6}' | sort | uniq -c | sort -rn | head -n 10 [/QUOTE]
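And in the same spirit as the per-status-code traffic command, a sketch that sums traffic per client IP instead (assuming the response size is field 10; awk treats a "-" size as 0):

Code:
# Total traffic per client IP, top 10 by volume
awk '{bytes[$1] += $10} END {for (ip in bytes) printf "%15s %12.2f MB\n", ip, bytes[ip]/1048576}' access_log | sort -k2 -rn | head -n 10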