Firewalling Web Services

Jan Newmarch
Min Huang
Kim Guan Chua
Monash University

Web services are being promoted as the latest silver bullet for building distributed systems. However, there are both business and security issues involved in using or supplying Web services, and it may be necessary to provide mechanisms for blocking Web services. This paper discusses some of the reasons why network administrators may wish to do this, and outlines some work being undertaken at present to provide blocking mechanisms for Web services.


Web services promise to be the next wave of technology for offering distributed services to enterprises [1, 2]. A Web service is defined by the World Wide Web Consortium as [3]

A Web service is a software system identified by a URI, whose public interfaces and bindings are defined and described using XML. Its definition can be discovered by other software systems. These systems may then interact with the Web service in a manner prescribed by its definition, using XML based messages conveyed by internet protocols.
There is much activity by a number of major companies to develop the technology and to implement services [4, 5, 6, 7], alongside the standardisation efforts of the W3 Consortium [8] and other groups such as the Web Services Interoperability Organisation [9]. A number of commentators point to Web services as a means of (finally) building robust, interoperable and scalable network applications employing reusable components.

Many aspects of Web services are still under development. One such area is security, and although the W3C claims this can be handled by XML encryption [10], there is concern among users [11] and attempts by vendors to fill the gap, such as [12, 13]. Another incomplete area is transactions [14].

While there are many business models for ecommerce involving use of a browser [15], there appears to be very little published concerning business models for Web services (defined as computer to computer communication using HTTP protocols). It is clear that some models such as advertising will not be suitable (since a program will not respond to an advert), and Bambury [16] argues that "the native internet economy and culture is largely free, disintermediated, ... and politically sophisticated" and that "it is quite possible that I-Commerce will fail to achieve what governments and business expect".

If commercial Web services are to succeed then they must address a number of business issues.

Some of the business models catalogued by Rappa [15] may be suitable for this, and it is possible that other models may emerge which also meet the profit and satisfaction goals.

Network administrators have a number of functions: these include not only enabling network access but also disabling such access when necessary.

This paper discusses aspects of Web services from the viewpoint of a network administrator. Security and business requirements point to a need to be able to selectively block Web services at the corporate firewall. Some techniques to do this are dealt with.


A number of network services are known to be security hazards. These include services such as remote shells, network file systems and even file transfer services. This is not counting virus threats posed by software which piggybacks on services such as email.

Firewalls are used to place barriers between the enterprise and the external world. These act at a variety of levels, from low-level packet filtering up to application-level proxies.

Web services are just a new version of a very old technology: remote procedure calls [17]. A program running on one host makes a procedure call to a server running on another host. The second host will perform the computation and return the result. Other such mechanisms include Sun RPC (renamed Open Network Computing RPC by the IETF) [18], CORBA [19] and Java RMI [20]. From the network point of view the only distinguishing factor about Web services is that they use port 80 - the same port that is used by most Web servers that deliver HTML documents. (Minor distinguishing features are related to the poor design and high overheads of the set of Web mechanisms.)

Network administrators do not like remote procedure calls which are made external to the enterprise: they open holes in the firewall. For example, there are many documented problems in typical RPC services such as NFS (Network File System) [21]. Services that rely on RPC mechanisms are usually blocked for external access, and may even be disabled for local access.

Current work practices in many organisations make regular use of the Web, to post information, access information and to perform business transactions. Access to port 80 (the standard port for the Web) has to be left open or the business might suffer. Web services use this port to perform activities that would otherwise be closed off, and so exploit a weakness in work practices. This is the major factor in the viability of Web services: Web services can only work because weaknesses in firewall technology mean that blocking Web services will also block ordinary Web access. (Other RPC systems such as Java RMI also have mechanisms to use port 80, but do not attempt to make this exploit a selling point!)


Security paranoia is not the only reason to block access to Web services. One of the possible business models for external services is a "pay by use" mechanism. These services are offered on a convenience basis, in that life becomes easier by using the service (just as using a credit card is easier than paying by cash). A typical example is validation of users to gain access to Web pages: access to an increasing number of Web sites requires individual user names and passwords (for "tracking purposes"). If this can be replaced by a single validation service then this will offer enormous convenience value to individuals, who may be willing to pay for this convenience.

Convenience will not come for free. Those offering Web services will impose charges for use of their services [22]. These services will act in the background, as part of some other activity and are unlikely to pop up a dialog box saying "please donate". Instead, they will simply deduct charges as they are incurred. These will be additional costs to the existing internet access costs. These costs will be incurred directly by the user, as there are no "advertising banners" in Web services whereby a third party (the advertiser) can be persuaded to pay the cost.

At least two major business costs using this model exist already:

Telephone system
Employees in most organisations are allowed to make an unlimited number of "business related calls". This typically swallows a large percentage of an organisational budget, and is essentially an uncontrolled expense.
Web access
This uses a similar model of unregulated access to an external costing agency (the internet provider). It is more recent, though, and the costs may be more noticeable.
These two systems have a common feature in that an individual employee is allowed to incur an essentially open-ended cost to the business, without having to use any regulatory mechanism such as a purchase request.

An IT manager who commits their organisation to yet another open costing model is brave indeed...

Leakage of information

Many companies specialise in gaining information about individuals and companies by tracking their Web usage. This is often done by techniques such as cookies or by "invisible" 1x1-pixel images.

Microsoft (as providers of Passport services) have in place a privacy policy which states that usage information of their Web services will not be misused [23]. The Windows Media Player has recently been discovered to be reporting usage patterns back to Microsoft, which decreases confidence in the Microsoft policies [24]. Microsoft have apparently dropped a key financial component (Hailstorm) due to difficulties in persuading major companies to use this service [25].

Other Web service providers may or may not have such policies in place, and this raises several further issues.

Web service filter

These issues point to the need for network managers to have the ability to block access to external Web services or block external access to internal Web services. This ability may or may not be used, depending on the security policies for the organisation, but if it is required then tools must be available to implement the policy.

According to Robert Zalenski [26], there are six popular firewall types: application-based firewall, packet filtering, stateful inspection, proxy, network address translation, and virtual private network. Most organisations now use a proxy for outgoing Web requests, and so we decided to use an HTTP proxy to filter requests for external Web services. Protection of internal services from external Web requests would need to be dealt with by a separate filter mechanism. We also deal only with unencrypted HTTP requests; SSL requests have opaque content and we cannot filter them [27].


Web service requests are sent as HTTP requests whose content is a SOAP document (a particular XML format) [28]. SOAP is still evolving: SOAP 1.2 is nearing completion at the time of writing. It defines SOAP requests using HTTP POST calls (but not HTTP GET). In addition, it specifies that the Content-Type header must be set to the SOAP media type, application/soap+xml. These two restrictions make it fairly straightforward (and low in overhead) to identify Web method requests.
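The cheap SOAP 1.2 check described above can be sketched as follows. This is hypothetical Python code, not the actual Squid module (which is written in C); it only illustrates that the test needs nothing more than the request line and one header.

```python
# Sketch of the low-overhead SOAP 1.2 identification test: a request is a
# candidate Web service call only if it is a POST whose Content-Type is the
# SOAP 1.2 media type. (Hypothetical helper, not the real proxy code.)

SOAP_12_MEDIA_TYPE = "application/soap+xml"

def is_soap12_request(method: str, headers: dict) -> bool:
    """Return True if the HTTP request looks like a SOAP 1.2 call."""
    if method.upper() != "POST":
        return False  # SOAP 1.2 requests use POST, not GET
    content_type = headers.get("Content-Type", "")
    # The header may carry parameters, e.g. "application/soap+xml; charset=utf-8"
    return content_type.split(";")[0].strip().lower() == SOAP_12_MEDIA_TYPE
```

Because only the method and one header are inspected, the check can run before the request body is read, which is what keeps the overhead low.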

Unfortunately, none of the Web services currently offered on public listing sites seem to conform to SOAP 1.2. Some are using HTTP GET, while others omit the Content-Type header. Those using GET are very hard to spot, while those that omit the Content-Type can be identified by attempting to parse them as SOAP documents and seeing whether the parse succeeds or fails. This imposes a high overhead on every POST request.
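The expensive fallback just described can be sketched as follows: when the Content-Type header is missing, the only recourse is to parse the POST body as XML and look for a SOAP Envelope at the root. This is hypothetical Python illustrating the idea, not the filter's actual C/C++ code.

```python
# Fallback check for SOAP requests that omit the Content-Type header:
# attempt a full XML parse of the body and test for a SOAP Envelope root.
# This is what makes the check expensive for every POST request.
import xml.etree.ElementTree as ET

SOAP_ENVELOPE_NAMESPACES = {
    "http://schemas.xmlsoap.org/soap/envelope/",  # SOAP 1.1
    "http://www.w3.org/2003/05/soap-envelope",    # SOAP 1.2
}

def looks_like_soap(body: bytes) -> bool:
    """Return True if the POST body parses as XML with a SOAP Envelope root."""
    try:
        root = ET.fromstring(body)
    except ET.ParseError:
        return False  # not well-formed XML, so not a SOAP request
    if not root.tag.startswith("{"):
        return False  # no namespace, cannot be a SOAP Envelope
    namespace, _, local_name = root.tag[1:].partition("}")
    return local_name == "Envelope" and namespace in SOAP_ENVELOPE_NAMESPACES
```

Note that the whole body must be buffered and parsed before the request can be classified, which is the overhead the text refers to.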

At present, very few (if any) Web services are actually in use by organisations, and we can expect that by the time Web services are widely deployed they will mainly be compliant with SOAP 1.2. We therefore concentrate on identifying SOAP requests at the 1.2 compliance level.

Squid filters

Squid [29] is widely used as an HTTP proxy. It is open source, so the code is available for modification. Squid also has a modular structure that allows different modules to be loaded, according to a configuration script.

Olaf Titz [30] has produced a module that provides several useful filters, such as a JavaScript filter, cookie filter, redirect filter and content-type filter. Each filter is deployed on the request header, the reply header or the reply content. The module allows filters to be added to one of three chains (for request header, reply header and reply content), and all filters on each chain are executed.

The filter module does not filter on request content. This is needed to examine the Content-Type and also the SOAP request itself. We added a fourth filter chain to this module to process request content.
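The chain structure described above can be sketched as follows. This is a hypothetical Python rendering for illustration only; the real module is C code loaded into Squid. It shows the three original chains plus the request-content chain we added, with every filter on a chain executed in turn.

```python
# Minimal sketch of the filter-chain structure: one chain per processing
# stage, all filters on a chain run in order, and any filter may veto the
# message. (Hypothetical illustration, not the actual Squid module.)

class FilterChains:
    STAGES = (
        "request_header",
        "request_content",  # the chain added for Web service filtering
        "reply_header",
        "reply_content",
    )

    def __init__(self):
        self.chains = {stage: [] for stage in self.STAGES}

    def add(self, stage, filter_fn):
        """Register a filter on one chain; filters run in registration order."""
        self.chains[stage].append(filter_fn)

    def run(self, stage, message):
        """Pass the message through every filter on the chain."""
        for filter_fn in self.chains[stage]:
            message, allowed = filter_fn(message)
            if not allowed:
                return message, False  # a filter blocked the message
        return message, True
```

A cookie filter of the kind the module already provides would, for example, be registered on the request_header chain, while the SOAP filter goes on the new request_content chain.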

Filter policy and rules

Security policies often fall into two classes: allow anything that is not explicitly denied (weak security) or deny everything that is not explicitly allowed (strong security). The filter can be set to enforce either of these policies.

A SOAP filter was written to parse a SOAP request and return a list of the method calls made by it. The open source SOAP parser of Scott Seely [31], written in C++, was used for this.
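What the SOAP filter extracts can be sketched as follows: in RPC-style SOAP, each immediate child of the Body element names one invoked method. This is a hypothetical Python stand-in for the C++ parser actually used.

```python
# Sketch of the SOAP filter's core job: parse a SOAP request body and
# return the names of the methods it calls. Each immediate child of the
# Body element corresponds to one RPC-style method invocation.
# (Hypothetical illustration, not the C++ parser used in the real filter.)
import xml.etree.ElementTree as ET

def soap_method_calls(body: bytes) -> list:
    """Return the local names of the method elements inside the SOAP Body."""
    root = ET.fromstring(body)
    methods = []
    for child in root:
        if child.tag.endswith("}Body"):
            for call in child:
                # Element tags look like "{namespace}localname"; keep the
                # local name, which is the method name.
                methods.append(call.tag.rsplit("}", 1)[-1])
    return methods
```

The filter then checks each returned method name against the rule list before deciding whether to forward the request.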

The filter rules currently take a simple form:

    <flag> <url> <methodname>
such as

    d http://example.com/ws checkDomain
    a http://example.com/ws checkMail
which will deny requests for the method checkDomain while allowing requests for the method checkMail (example.com here stands in for the actual service URL).
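The rule matching, combined with the default policies described earlier (deny-by-default for strong security, allow-by-default for weak security), can be sketched as follows. This is hypothetical Python for illustration; the real filter lives inside the Squid module, and the URLs are placeholders.

```python
# Sketch of rule parsing and matching for rules of the form
# "<flag> <url> <methodname>", where flag "a" allows and "d" denies.
# The default policy applies when no rule matches. (Hypothetical code.)

def parse_rules(text: str) -> list:
    """Parse rule lines of the form '<flag> <url> <methodname>'."""
    rules = []
    for line in text.strip().splitlines():
        flag, url, method = line.split()
        rules.append((flag, url, method))
    return rules

def is_allowed(rules, url, method, default_allow=False):
    """First matching rule wins; otherwise fall back to the default policy."""
    for flag, rule_url, rule_method in rules:
        if rule_url == url and rule_method == method:
            return flag == "a"
    return default_allow  # False = strong (deny-by-default) security
```

Setting default_allow selects between the two policy classes: False denies everything not explicitly allowed, True allows everything not explicitly denied.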

It would be a straightforward matter to extend the filter rule format to include wildcards or patterns for domains or methods, but this would introduce some extra overhead.

Results and future work

We have modified some open source proxy and filter software to allow Web service requests to be allowed or denied when passed through an HTTP proxy. Our modifications are also available as open source. The filter works best with SOAP 1.2, but can handle SOAP 1.1 POST requests with additional overheads.

A number of items of work remain to be done.


[1] Wylie Wong, Margaret Kane and Mike Ricciuti, "Web services try to rise above din", CNET.
[2] Staff, "Web Services: the Next Big Thing?".
[3] W3 Consortium, Web Services Architecture.
[4] IBM developerWorks, IBM WebSphere SDK for Web Services.
[5] Sun Microsystems, Java Technology and Web Services.
[6] Microsoft, Web Services Developer Center.
[7] Oracle, Web Services Technology Center.
[8] W3 Consortium, Web Services Activity.
[9] WS-I, Web Services Interoperability Organisation Home Page.
[10] W3 Consortium, Web Services Architecture Usage Scenarios.
[11] Anne Chen, "Web Services Secure?".
[12] Microsoft, Web Services Enhancements (WSE).
[13] IBM developerWorks, Specification: Web Services Security (WS-Security).
[14] IBM developerWorks, Specification: Web Services Transaction (WS-Transaction).
[15] Michael Rappa, Business Models on the Web.
[16] Paul Bambury, "A Taxonomy of Internet Commerce".
[17] A. S. Tanenbaum, Modern Operating Systems, Prentice-Hall, 1992.
[18] IETF, ONC Remote Procedure Call.
[19] Object Management Group, CORBA.
[20] Sun Microsystems, Java Remote Method Invocation.
[21] Vicki Brown and Dan Egnor, NFS (in)security administration and information clearinghouse.
[22] Mike Clark, "Making Money out of Selling Web Services - Part I".
[23] Microsoft, Microsoft .NET Passport Privacy Statement.
[24] Richard M. Smith, "Serious privacy problems in Windows Media Player for Windows XP".
[25] Robyn Weisman, "Microsoft Puts Hailstorm's 'My Services' on Hold".
[26] Robert Zalenski, "Firewall Technology", IEEE Potentials, IEEE, Feb/Mar 2002.
[27] Netscape, SSL 3.0 Specification.
[28] World Wide Web Consortium, SOAP Version 1.2 Part 0: Primer.
[29] Squid Web Proxy Cache.
[30] Olaf Titz, Filter modules for Squid.
[31] Scott Seely, SOAP: Cross Platform Web Service Development Using XML, Prentice-Hall, 2002.

Jan Newmarch
Last modified: Tue Sep 16 09:12:27 EST 2003
Copyright © Jan Newmarch