If you believe that popular, trusted websites like Facebook and PayPal are immune to exploits from previous eras, you are mistaken. Research shows that a number of popular websites and online services are vulnerable to an attack technique first described back in 1998, which has lately made a comeback. The underlying flaw, now dubbed ROBOT, traces back to work by Daniel Bleichenbacher in 1998.
ROBOT stands for the Return Of Bleichenbacher’s Oracle Threat. The new version of the attack was identified recently by researchers Hanno Böck, Juraj Somorovsky, and Craig Young, who reported it through Facebook’s bug bounty program. Upon discovering the vulnerability, the researchers were paid a considerable reward, the amount of which has not been disclosed by the social network or the researchers. The findings were published on Tuesday.
ROBOT is found in the TLS (Transport Layer Security) protocol and affects many leading websites: attackers can decrypt encrypted data and use the site’s private key to sign communications. TLS provides the encryption behind HTTPS, and the flaw lies in how servers handle RSA-encrypted key exchange.
The attack involves the use of specially crafted queries that cause TLS servers to produce errors amounting to yes-or-no answers; the technique is called an adaptive chosen-ciphertext attack. These servers protect communication between a user’s browser and a website by decrypting HTTPS traffic. If the attack is successful, the attacker can also passively monitor and record traffic, and it is possible to carry out a man-in-the-middle attack using this flaw.
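To make the oracle idea concrete, here is a toy Python sketch, using a deliberately tiny, insecure RSA key and a padding check simplified to “plaintext starts with bytes 0x00 0x02” (the part of PKCS#1 v1.5 the real attack exploits). All names and parameters here are illustrative, not from the researchers’ tooling:

```python
# Toy sketch of a Bleichenbacher-style padding oracle.
# The key is tiny and insecure; the padding check is simplified.

p, q = 65521, 65537                  # small primes (illustrative only)
n = p * q                            # RSA modulus (~32 bits)
e = 11                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent
k = (n.bit_length() + 7) // 8        # modulus length in bytes (4)

def oracle(c):
    """Server-side check: does the decryption look PKCS#1-padded?
    A real vulnerable server leaks this bit via distinct TLS errors."""
    m = pow(c, d, n)
    return m.to_bytes(k, "big")[:2] == b"\x00\x02"

# A "conformant" plaintext: 0x00 0x02, one nonzero pad byte, 0x00.
m = int.from_bytes(b"\x00\x02\xab\x00", "big")
c = pow(m, e, n)                     # ciphertext the attacker captured
assert oracle(c)                     # genuine ciphertext passes

# RSA malleability: multiplying c by s^e makes the server decrypt
# m*s mod n. Each query then answers "does m*s mod n start with
# 0x00 0x02?" -- one bit per query, which the full attack uses to
# progressively narrow an interval around m until it is recovered.
s = 2
while not oracle(c * pow(s, e, n) % n):
    s += 1
print("first s for which the blinded ciphertext passes the oracle:", s)
```

The full 1998 attack repeats this blinding-and-querying step adaptively, shrinking the set of candidate plaintexts with every answer until the encrypted premaster secret is fully recovered.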
The same technique was used in Bleichenbacher’s original 1998 attack; however, rather than replacing the unprotected RSA encryption mode, the TLS standard was modified to add countermeasures that make the oracle much harder to exploit.
“After Bleichenbacher’s original attack the designers of TLS decided that the best course of action was to keep the vulnerable encryption modes and add countermeasures. Later research showed that these countermeasures were incomplete leading the TLS designers to add more complicated countermeasures. The section on Bleichenbacher countermeasures in the latest TLS 1.2 standard (7.4.7.1) is incredibly complex. It is not surprising that these workarounds aren’t implemented correctly,” the researchers wrote in their blog post.
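The countermeasure the researchers refer to boils down to: never tell the client whether the padding was valid. A minimal Python sketch of that idea follows; the function and variable names are hypothetical, not taken from any TLS library, and a production implementation would also need to be constant-time:

```python
import os

def premaster_from_decryption(decrypted: bytes, client_version: bytes) -> bytes:
    """Sketch of the idea in RFC 5246 section 7.4.7.1: whatever happens
    during RSA decryption and unpadding, always continue the handshake
    with a 48-byte premaster secret. On any failure, silently substitute
    random bytes instead of sending a distinguishable error, so the
    attacker's yes/no oracle disappears -- the handshake simply fails
    later, identically in both cases."""
    fake = os.urandom(48)  # generated unconditionally, before any check
    ok = len(decrypted) == 48 and decrypted[:2] == client_version
    return decrypted if ok else fake

# A valid-looking decryption is used as-is...
good = b"\x03\x03" + os.urandom(46)
assert premaster_from_decryption(good, b"\x03\x03") == good
# ...while a malformed one is silently replaced, not rejected.
out = premaster_from_decryption(b"garbage", b"\x03\x03")
assert len(out) == 48 and out != b"garbage"
```

ROBOT exists precisely because many real implementations get this substitution subtly wrong, leaking the padding-validity bit through error codes, alerts, or timing differences.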
Since the original countermeasures were introduced, several variants of the attack have surfaced. In March 2016, for example, a related vulnerability called DROWN exposed around 33% of HTTPS servers to attackers.
The researchers say that numerous vendors have failed to implement appropriate countermeasures against attacks exploiting ROBOT. Vulnerable implementations from seven vendors have been identified so far, including F5, Cisco, and Citrix, and some of the most popular websites on the internet are affected, including Facebook and PayPal. Vulnerable subdomains were found on 27 of the top 100 domains, as per Alexa’s ranking.
In an advisory issued on Tuesday, Cisco rated the vulnerability as Medium and confirmed that multiple Cisco products are affected, including the Cisco ACE30 Application Control Engine Module and the ACE 4710 Application Control Engine Appliance. PayPal and Facebook, for their part, issued patches in October 2017.
The researchers have offered various stopgap mitigations on their blog, along with an online tool for checking public HTTPS servers and a Python tool for testing for the flaw.
“Most modern TLS connections use an Elliptic Curve Diffie Hellman key exchange and need RSA only for signatures. We believe RSA encryption modes are so risky that the only safe course of action is to disable them. Apart from being risky these modes also lack forward secrecy,” the researchers added.
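In practice, disabling RSA key-exchange modes is a TLS-configuration change. A hedged sketch using Python’s standard `ssl` module is shown below; the `!kRSA` exclusion is an OpenSSL cipher-string convention, and exactly which suites remain depends on the OpenSSL build Python is linked against:

```python
import ssl

# Build a server context that refuses static-RSA key exchange,
# keeping (EC)DHE suites, which also provide forward secrecy.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.set_ciphers("ECDHE+AESGCM:DHE+AESGCM:!aNULL:!kRSA")

# No remaining TLS 1.2 suite should use plain RSA key exchange
# (OpenSSL describes such suites with "Kx=RSA").
for cipher in ctx.get_ciphers():
    assert "Kx=RSA " not in cipher["description"]
print(len(ctx.get_ciphers()), "suites enabled, none with RSA key exchange")
```

The same cipher-string syntax applies to server software such as nginx or Apache that delegates cipher selection to OpenSSL.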
The list of some of the sites affected by the ROBOT flaw is available here.