The Hidden Dangers of Web Services
Web services are meant to bring a certain agnosticism to technology, standardizing the transmission of data and information through XML and SOAP. Despite what web services and the resulting service-oriented architecture (SOA) deliver now and promise to deliver in the near future, the technology still has a major hurdle to overcome: security.
The security concerns that come bundled with SOA are similar to the threats that rose with the growth of the Internet. In fact, several security parallels can be drawn between Internet connectivity and SOA.
Both allow more direct communication with potentially vulnerable code residing on the back end. Both were adopted faster than the risk could be managed: SOA in particular creates a level of connectivity that the makers of the legacy applications it often exposes never designed for.
Mainframe terminal application developers, for example, were most concerned with performance, accidental entries, and reliability because their users were trusted and hard-wired. Connectivity removes trust from the operating environment.
The SOA risks are often highly individualized, taking different forms in every instance and every company. The result, in many cases, is a field of security landmines waiting to be triggered by the treading attacker.
If web services are implemented with security in mind, though, they can fundamentally reduce risk through proper filtering and by limiting the surface of the application exposed to the outside world.
Data Gone Wild
Most software security vulnerabilities stem from unenforced assumptions about data. As an example, consider an application that processes customer information, and one particular field, Customer Surname, that is assumed to be no longer than 20 characters.
The application typically receives this data from a client application that checks to make sure no more than 20 characters are passed to the server application. Now imagine putting a web services interface on that server application.
Data is now transmitted as part of an XML document, likely with no length constraint at all. If a client sends a request with more than 20 characters, the result could be a potentially exploitable application fault such as a buffer overflow.
This means that when software is exposed through web services, we need to take care that data is properly constrained. The ideal case is, of course, to validate data within the application itself, but for legacy systems this filtering may not exist and may no longer be feasible to add.
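When the legacy code cannot be changed, the constraint can be enforced in the web-service layer that fronts it. Below is a minimal sketch of that idea; the element name, the helper, and the 20-character limit from the surname example are illustrative, not a prescribed API:

```python
# Hypothetical validation shim for a web service fronting a legacy
# application. It re-imposes the length assumption the old client
# used to enforce, before data reaches the back end.
import xml.etree.ElementTree as ET

MAX_SURNAME_LEN = 20  # limit the legacy back end silently assumes


def extract_surname(xml_payload: str) -> str:
    """Parse the request and reject data the back end cannot handle."""
    root = ET.fromstring(xml_payload)
    node = root.find("CustomerSurname")
    if node is None or node.text is None:
        raise ValueError("CustomerSurname element missing")
    surname = node.text
    if len(surname) > MAX_SURNAME_LEN:
        raise ValueError("CustomerSurname exceeds 20 characters")
    return surname
```

The same effect can often be achieved declaratively, for example with an xs:maxLength facet in the service's XML Schema, so that oversized fields are rejected before any application code runs.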
Beyond data being sent to the server, when a web service is created an implicit contract is forged between the provider and the consuming application about the format and range of data exchanged.
Back-end server code may be changed in ways that alter response data. A client may be built assuming it will receive a fixed-size response; if the implementation changes, client applications may be at risk. When implementing web services, then, it is critical to establish a set of data boundaries and to ensure those boundaries remain consistent even after plumbing changes to the underlying code.
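The contract cuts both ways: a defensive client validates responses against the agreed boundaries rather than assuming them. A minimal sketch, assuming a hypothetical text balance field with an illustrative 64-byte limit:

```python
# Hypothetical client-side guard: instead of trusting that the server
# still honors the original fixed-size contract, the client checks the
# response before using it.
def parse_balance_response(raw: bytes, max_len: int = 64) -> str:
    """Validate a raw web-service response against the agreed contract."""
    if len(raw) > max_len:
        raise ValueError("response larger than the contract allows")
    text = raw.decode("utf-8").strip()  # raises on malformed bytes
    if not text:
        raise ValueError("empty response")
    return text
```

If the back-end plumbing later changes and starts returning oversized or malformed data, this client fails safely with a clear error instead of corrupting its own state.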
Walk Down the Right Paths
When moving from a controlled client to a web services approach, one faces the possibility that functions may be called out of sequence. Think about a simple everyday process like making a cup of coffee, and imagine we had software functions for the key activities:

- pour_coffee()
- pour_cream()
- stir()
Executed in this order, we get a decent cup of coffee. The process has some tolerance for failure, too, in that the relative order of pour_coffee() and pour_cream() isn't particularly important, but the fact that they both precede stir() is.
While existing software might have forced users to do these steps in order, slapping a web services front end onto the application means careless users or attackers might be able to execute them out of order, because they may be able to access these functions directly, with unspecified results.
If you carry this analogy to financial transactions, there may be some disastrous implications of allowing activities to be done out of sequence.
One of the biggest mistakes when rolling out web services is failing to look at how the system will be deployed. A common pitfall is to assume your application, transactions, and data will be protected by existing enterprise defenses.
An interesting property of web services transactions is that data often flies past such defenses without any real inspection. Because transactions happen as globs of XML, network defenses lack the contextual information to determine whether a specific message is hostile or contains data that could cause the application to fail.
Another common snag is failing to test the security of both the client and the server. One must always protect the server from malicious data being sent from a user but it is equally important to make sure client applications are robust and can handle responses to web services requests that may be generated by an attacker impersonating the server.
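Testing the client side can be as simple as replaying the kinds of responses an impersonating server might send and confirming the client rejects each one cleanly. A minimal sketch, with a hypothetical response parser and an illustrative set of hostile payloads:

```python
# Illustrative robustness check for a client-side response parser:
# every hostile payload should be rejected with a clean exception,
# never accepted and never allowed to crash the process.
import xml.etree.ElementTree as ET


def parse_status(raw: bytes) -> str:
    """Hypothetical client parser for a simple status response."""
    root = ET.fromstring(raw.decode("utf-8"))  # may raise; that is the point
    status = root.findtext("Status")
    if status not in ("OK", "ERROR"):
        raise ValueError("unexpected status value")
    return status


HOSTILE_RESPONSES = [
    b"",                                                      # empty body
    b"<Status>OK",                                            # truncated XML
    b"<Response><Status>" + b"A" * 10000 + b"</Status></Response>",  # oversized
    b"\xff\xfe garbage",                                      # not valid UTF-8
]


def survives_hostile_input(parser) -> bool:
    """Return True only if every hostile payload is rejected cleanly."""
    for payload in HOSTILE_RESPONSES:
        try:
            parser(payload)
            return False  # parser accepted hostile input
        except (ValueError, ET.ParseError, UnicodeDecodeError):
            continue  # rejected cleanly: good
    return True
```

The same harness can be pointed at any response parser, making "does the client survive a malicious server?" a repeatable test rather than an afterthought.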
Web services continue to change the way applications interact. The approach offers a solution to the problem of having applications communicate without needing to consider their implementation.
Like any new technology, it brings benefit as well as risk to the IT environment.
As more information flows through HTTP, we need to consider not just the code that's written but how that shift in communications conflicts with, or negates, network defenses that may already be in place.
As with any new technology trend it is important to remember the law of attacker economics: The more a technology gets used, the more it will be attacked. If built with security in mind, though, SOA has the ability to reduce risk by shielding an application instead of simply exposing it to would-be attackers.
Dr. Herbert Thompson is chief security strategist at Security Innovation.