NO.1 Preparation
Tool Preparation
1.xray
xray (https://github.com/chaitin/xray) is the community edition of the scanning engine behind Chaitin's Dongjian (洞鉴) product. It supports both active and passive scanning, ships with a built-in blind-vulnerability (reverse connection) platform, lets you define POCs flexibly, is feature-rich and simple to invoke, and runs on Windows, macOS and Linux, covering most automated web vulnerability detection needs.
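As a quick taste of both modes (the binary name, target URL and output file names below are placeholders; the listener port is arbitrary):
# Active scan: xray's built-in basic crawler walks the target and scans what it finds
./xray webscan --basic-crawler http://example.com --html-output xray-active.html
# Passive scan: xray listens as an HTTP proxy and scans whatever traffic flows through it
./xray webscan --listen 127.0.0.1:7777 --html-output xray-passive.html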
2.crawlergo
crawlergo (https://github.com/0Kee-Team/crawlergo) is a dynamic crawler that uses headless Chrome to collect URL entry points. Its smart URL de-duplication module filters out most pseudo-static URLs while still making sure no key entry links are missed, which greatly cuts down on redundant scan tasks.
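A minimal run might look like this (the Chromium path and target are placeholders; if memory serves, -o json switches the output to machine-readable JSON so the collected requests can be fed to other tools):
./crawlergo -c /path/to/chromium -t 10 -o json http://example.com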
3.w13scan
W13scan (https://github.com/w-digital-scanner/w13scan) is an open-source web vulnerability discovery tool written in Python 3. It supports both active and passive scanning modes and runs on Windows, Linux and macOS.
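For instance, an active scan of a single URL reuses the -u option shown in the help output further down (the target here is the placeholder from that help text):
python3 w13scan.py -u "http://www.site.com/vuln.php?id=1"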
4.chromium
Chromium (https://download-chromium.appspot.com/) is the open-source web browser project led by Google.
NO.2 Configuration
Digging for bugs by hand and worried you will come up empty? Before your hands leave the keyboard, run one pass of 360's crawler (crawlergo) + xray + w13scan passive scanning: if that finds nothing, there really is nothing, and if there is something, you would not have found it by hand anyway.
Both xray and w13scan are web vulnerability scanners.
W13scan configuration
git clone https://github.com/w-digital-scanner/w13scan.git
cd w13scan # enter the cloned repository
pip3 install -r requirements.txt
cd W13SCAN # enter the source directory
python3 w13scan.py -h
Usage:
usage: w13scan [options]
optional arguments:
-h, --help show this help message and exit
-v, --version Show program's version number and exit
--debug Show program's exception
--level {1,2,3,4,5} different level use different payload: 0-5 (default 2)
Proxy:
Passive Agent Mode Options
-s SERVER_ADDR, --server-addr SERVER_ADDR
server addr format:(ip:port)
Target:
options has to be provided to define the target(s)
-u URL, --url URL Target URL (e.g. "http://www.site.com/vuln.php?id=1")
-f URL_FILE, --file URL_FILE
Scan multiple targets given in a textual file
Request:
Network request options
--proxy PROXY Use a proxy to connect to the target URL
eg:[email protected]:8080 or [email protected]:1080
--timeout TIMEOUT Seconds to wait before timeout connection (default 30)
--retry RETRY Time out retrials times.
Output:
output
--html When selected, the output will be output to the output
directory by default, or you can specify
--json JSON The json file is generated by default in the output
directory, you can change the path
Optimization:
Optimization options
-t THREADS, --threads THREADS
Max number of concurrent network requests (default 31)
--disable DISABLE [DISABLE ...]
Disable some plugins (e.g. --disable xss sqli_error
webpack)
--able ABLE [ABLE ...]
Enable some moudle (e.g. --enable xss webpack)
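Putting a few of those options together, a passive listener with an HTML report, more threads and a couple of plugins disabled might look like this (the port, thread count and plugin names are just illustrative choices):
python3 w13scan.py -s 127.0.0.1:7778 --html --threads 20 --disable xss webpack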
xray configuration
Download the xray release that matches your platform.
Usage:
USAGE:
[global options] command [command options] [arguments...]
COMMANDS:
webscan Run a webscan task
servicescan Run a service scan task
subdomain Run a subdomain task
poclint lint yaml poc
reverse Run a standalone reverse server
genca GenerateToFile CA certificate and key
upgrade check new version and upgrade self if any updates found
version Show version info
help, h Shows a list of commands or help for one command
GLOBAL OPTIONS:
--config FILE Load configuration from FILE (default: "config.yaml")
--log-level value Log level, choices are debug, info, warn, error, fatal
--help, -h show help
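For passive scanning, the two subcommands used most are genca, which generates the CA certificate xray needs to decrypt proxied HTTPS traffic (import the resulting ca.crt wherever the traffic originates), and webscan with --listen. A sketch, with the binary name, listener address and report name as placeholders:
./xray genca
./xray webscan --listen 127.0.0.1:5555 --html-output xray-report.html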
chromium configuration
crawlergo needs a Chromium installation; download the snapshot for your platform from the link above and unpack it.
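On macOS the executable sits inside the app bundle, which is why the crawlergo commands in the next section pass such a long -c path. A quick way to confirm the binary runs (the path is illustrative and will differ on your machine):
/Applications/Chromium.app/Contents/MacOS/Chromium --version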
crawlergo configuration
Then download crawlergo from https://github.com/0Kee-Team/crawlergo. I am on macOS, so I grabbed the macOS release.
NO.3 Implementation
First, test the crawler on its own. The command format is: crawlergo -c YourChromiumPath -t TabCount Url
./crawlergo -c /private/var/folders/p8/msmxt0v13h53qgsl65hx_wy00000gn/T/AppTranslocation/65BC47A4-01B8-4A03-8B41-B468CE47D9A1/d/Chromium.app/Contents/MacOS/Chromium -t 10 --request-proxy http://127.0.0.1:5555 http://testhtml5.vulnweb.com
Linking with xray:
First, start xray listening (the --request-proxy in the crawler command above points at this listener):
./xray_darwin_amd64 webscan --listen 127.0.0.1:5555 --html-output 666.html
Linking with w13scan:
First, start the listener:
python3 W13SCAN/w13scan.py -s 127.0.0.1:7778 --html
Then run the crawler exactly as before, just with the proxy pointed at w13scan's port:
./crawlergo -c /private/var/folders/p8/msmxt0v13h53qgsl65hx_wy00000gn/T/AppTranslocation/65BC47A4-01B8-4A03-8B41-B468CE47D9A1/d/Chromium.app/Contents/MacOS/Chromium -t 10 --request-proxy http://127.0.0.1:7778 http://192.168.31.229/test/
Open the generated HTML report: it lists the vulnerable URL, the vulnerability type, the affected parameter, the payload that triggered it, and the full request details.
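To sweep more than one site, a small wrapper loop around the same crawler command is enough. A rough sketch, assuming a targets.txt with one URL per line and xray (or w13scan) already listening on 127.0.0.1:5555:
# batch_crawl.sh - hypothetical helper, not part of crawlergo/xray/w13scan
while read -r url; do
  ./crawlergo -c /path/to/Chromium -t 10 --request-proxy http://127.0.0.1:5555 "$url"
done < targets.txt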
NO.4 References
https://github.com/w-digital-scanner/w13scan
https://github.com/0Kee-Team/crawlergo
https://github.com/timwhitez/crawlergo_x_XRAY
This article was first published on the WeChat public account of Hillstone Networks Security Research Institute (山石网科安全技术研究院): "crawlergo combined with xray and w13scan passive scanning for automated vulnerability hunting".