Web services are being promoted as the latest silver bullet for building distributed systems. However, there are both business and security issues involved in using or supplying Web services, and it may be necessary to provide mechanisms for blocking Web services. This paper discusses some of the reasons why network administrators may wish to do this, and outlines some work being undertaken at present to provide blocking mechanisms for Web services.
Web services promise to be the next wave of technology for offering distributed services to enterprises [1, 2]. A Web service is defined by the World Wide Web Consortium as:
A Web service is a software system identified by a URI, whose public interfaces and bindings are defined and described using XML. Its definition can be discovered by other software systems. These systems may then interact with the Web service in a manner prescribed by its definition, using XML based messages conveyed by internet protocols.

There is much activity by a number of major companies to develop the technology and to implement services such as [4, 5, 6, 7], as well as the standardisation efforts of the W3 Consortium and other groups such as the Web Services Interoperability Organisation. A number of commentators point to Web services as a means of (finally) building robust, interoperable and scalable network applications employing reusable components.
Many aspects of Web services are still under development. One such area is security: although the W3C claims this can be handled by XML encryption, there is concern among users, and vendors are attempting to fill the gap with products such as [12, 13]. Another incomplete area is transactions.
While there are many business models for ecommerce involving use of a browser, there appears to be very little published concerning business models for Web services (defined as computer to computer communication using HTTP protocols). It is clear that some models such as advertising will not be suitable (since a program will not respond to an advert), and Bambury argues that "the native internet economy and culture is largely free, disintermediated, ... and politically sophisticated" and that "it is quite possible that I-Commerce will fail to achieve what governments and business expect".
If commercial Web services are to succeed then they must address the issues of:
Network administrators have a number of functions, which include not only enabling network access but also disabling such access when necessary. A network administrator's responsibilities include
This paper discusses aspects of Web services from the viewpoint of a network administrator. Security and business requirements point to a need to be able to selectively block Web services at the corporate firewall. Some techniques for doing this are described.
A number of network services are known to be security hazards. These include services such as remote shells, network file systems and even file transfer services. This is not counting virus threats posed by software which piggybacks on services such as email.
Firewalls are used to place barriers between the enterprise and the external world. These act at a variety of levels such as
Web services are just a new version of a very old technology: remote procedure calls . A program running on one host makes a procedure call to a server running on another host. The second host will perform the computation and return the result. Other mechanisms include Sun RPC (renamed as the IETF Open Network Computing) , CORBA  and Java RMI . From the network point of view the only distinguishing factor about Web services is that they use port 80 - the same port that is used by most Web servers that deliver HTML documents. (Minor distinguishing features are related to the poor design and high overheads of the set of Web mechanisms.)
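The point that a Web service call is indistinguishable, at the network level, from ordinary browser traffic can be seen by constructing one. The sketch below (endpoint, method name and envelope are illustrative, following a SOAP 1.2 draft namespace) builds the raw HTTP request a Web service client would send: to a port-based packet filter it is just another POST to port 80.

```python
# Illustrative only: the host, path and "checkDomain" method are
# hypothetical, and the envelope namespace follows a SOAP 1.2 draft.
SOAP_BODY = """<?xml version="1.0"?>
<env:Envelope xmlns:env="http://www.w3.org/2002/06/soap-envelope">
  <env:Body>
    <checkDomain xmlns="urn:example">
      <name>example.com</name>
    </checkDomain>
  </env:Body>
</env:Envelope>"""

def build_request(host, path, body):
    """Return the raw HTTP request for a SOAP call. Apart from its
    Content-Type and XML body, this is indistinguishable from a
    browser form POST on port 80."""
    headers = {
        "Host": host,
        "Content-Type": "application/soap",  # per the (draft) SOAP 1.2 rules
        "Content-Length": str(len(body)),
    }
    lines = [f"POST {path} HTTP/1.1"]
    lines += [f"{k}: {v}" for k, v in headers.items()]
    return "\r\n".join(lines) + "\r\n\r\n" + body
```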
Network administrators do not like remote procedure calls which are made external to the enterprise: they open holes in the firewall. For example, there are many documented problems in typical RPC services such as NFS (Network File System) . Services that rely on RPC mechanisms are usually blocked for external access, and may even be disabled for local access.
Current work practices in many organisations make regular use of the Web, to post information, access information and to perform business transactions. Access to port 80 (the standard port for the Web) has to be left open or the business might suffer. Web services use this port to perform activities that would otherwise be closed off, and so exploit a weakness in work practices. This is the major factor in the viability of Web services: Web services can only work because weaknesses in firewall technology mean that blocking Web services will also block ordinary Web access. (Other RPC systems such as Java RMI also have mechanisms to use port 80, but do not attempt to make this exploit a selling point!)
Security paranoia is not the only reason to block access to Web services. One of the possible business models for external services is a "pay by use" mechanism. These services are offered on a convenience basis, in that life becomes easier by using the service (just as using a credit card is easier than paying by cash). A typical example is validation of users to gain access to Web pages: access to an increasing number of Web sites requires individual user names and passwords (for "tracking purposes"). If this can be replaced by a single validation service then this will offer enormous convenience value to individuals, who may be willing to pay for this convenience.
Convenience will not come for free. Those offering Web services will impose charges for use of their services . These services will act in the background, as part of some other activity and are unlikely to pop up a dialog box saying "please donate". Instead, they will simply deduct charges as they are incurred. These will be additional costs to the existing internet access costs. These costs will be incurred directly by the user, as there are no "advertising banners" in Web services whereby a third party (the advertiser) can be persuaded to pay the cost.
At least two major business costs of this model already exist:
An IT manager who commits their organisation to yet another open costing model is brave indeed...
Many companies specialise in gaining information about individuals and companies by tracking their Web usage. This is often done by techniques such as cookies or by "invisible" 1x1-pixel images.
Other Web service providers may or may not have such policies in place. There are several issues:
These issues point to the need for network managers to have the ability to block access to external Web services or block external access to internal Web services. This ability may or may not be used, depending on the security policies for the organisation, but if it is required then tools must be available to implement the policy.
According to Robert Zalenski, there are six popular firewall types: application-based firewall, packet filtering, stateful-inspection, proxy, network address translation, and virtual private network. Most organisations now use a proxy for outgoing Web requests, and so we decided to use an HTTP proxy to filter requests for external Web services. Protection of internal services from external Web requests would need to be dealt with by a separate filter mechanism. We also deal only with unencrypted HTTP requests; SSL requests have opaque content and we cannot filter them.
Web service requests are sent as HTTP requests with content as a SOAP document (a particular XML type). SOAP is still evolving; SOAP 1.2 is nearing completion (at the time of writing). It defines SOAP requests as HTTP POST calls (but not HTTP GET calls). In addition, it specifies that the ContentType must be set to application/soap. These two restrictions make it fairly straightforward (and with low overhead) to identify Web method requests. Unfortunately, none of the Web services currently offered on sites such as www.xmethods.com seem to conform to SOAP 1.2. Some use HTTP GET, while others omit the ContentType. Requests using GET are very hard to spot, while those that omit the ContentType can be identified by attempting to parse them as SOAP documents and seeing if the parse succeeds or fails. This has high overheads for any filter.
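The cheap test made possible by the two SOAP 1.2 restrictions can be sketched as follows (a minimal illustration, not the implementation; it assumes headers are supplied as a name/value mapping):

```python
def looks_like_soap12(method, headers):
    """Cheap check using the two SOAP 1.2 restrictions: the request
    must be an HTTP POST and must carry a SOAP content type.
    Header names are matched case-insensitively, and any parameters
    (e.g. charset) after ";" are ignored."""
    if method.upper() != "POST":
        return False
    ctype = ""
    for name, value in headers.items():
        if name.lower() == "content-type":
            ctype = value.split(";")[0].strip().lower()
    return ctype.startswith("application/soap")

# Non-compliant services (SOAP over GET, or POST without the content
# type) slip past this check and can only be caught by the expensive
# fallback: attempting a full SOAP parse of the request body.
```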
At present, very few (if any) Web services are used by organisations, and we can expect that by the time Web services are widely deployed they will mainly be compliant with SOAP 1.2. We therefore concentrate on identifying SOAP requests at the 1.2 compliance level.
Squid is widely used as an HTTP proxy. It is open source, so the code is available for modification. Squid also has a modular structure that allows different modules to be loaded, according to a configuration script.
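For contrast, Squid's standard access controls can already deny requests by destination, but not by request content, which is why a custom module is needed. A hypothetical configuration fragment (the host name is illustrative):

```
# Deny all requests to a known Web service host; ordinary Web access
# to other hosts is unaffected - but so is any inspection of SOAP
# content, hence the need for a content-aware filter module.
acl soap_hosts dstdomain services.example.com
http_access deny soap_hosts
http_access allow all
```

This blocks every request to the named host, including plain HTML pages, so it cannot distinguish a Web service call from normal browsing on the same server.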
The filter module does not filter on request content. This is needed to examine the ContentType and also the SOAP request itself. We added another filter chain to this module to process request content.
Security policies often fall into two classes: allow anything that is not explicitly denied (weak security) or deny everything that is not explicitly allowed (strong security). The filter can be set to enforce either of these policies.
A SOAP filter was written to parse a SOAP request and return a list of the method calls made by it. Scott Seely's open source SOAP parser, written in C++, was used for this.
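The extraction step can be sketched in a few lines (in Python rather than the C++ parser used in the implementation): the method calls are the immediate children of the SOAP Body element, and a document that fails to parse yields no methods.

```python
import xml.etree.ElementTree as ET

def soap_methods(document):
    """Return the local names of the method calls in a SOAP envelope,
    or [] if the document does not parse as XML at all."""
    try:
        root = ET.fromstring(document)
    except ET.ParseError:
        return []
    methods = []
    for child in root:
        # match the Body element regardless of its namespace prefix
        if child.tag == "Body" or child.tag.endswith("}Body"):
            for call in child:
                # strip any "{namespace}" prefix from the tag name
                methods.append(call.tag.split("}")[-1])
    return methods
```

An empty result for a request claiming to be SOAP is exactly the parse-failure signal mentioned earlier for identifying non-compliant requests.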
The filter rules are currently of a simple form

<flag> <url> <methodname>

For example, the rules

d http://220.127.116.11:9090/soap checkDomain
a http://18.104.22.168:9090/soap checkMail

will deny requests for the method checkDomain while allowing requests for the method checkMail.
It would be a straightforward matter to extend the filter rule format to include wildcards or patterns for domains or methods, but this would introduce some extra overhead.
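Evaluation of the rules above can be sketched as follows (a minimal illustration, not the proxy code; the default argument supplies the weak/strong policy choice described earlier):

```python
def check(rules, url, method, default="d"):
    """Decide whether a Web method request is allowed.
    rules is a list of (flag, url, methodname) triples; flag is
    "a" (allow) or "d" (deny). The default policy applies when no
    rule matches: "a" = allow anything not denied (weak security),
    "d" = deny everything not allowed (strong security)."""
    for flag, rule_url, rule_method in rules:
        if rule_url == url and rule_method == method:
            return flag == "a"
    return default == "a"
```

First match wins, so the rule list order matters if two rules ever name the same URL and method.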
We have modified some open source proxy and filter software to allow Web service requests to be allowed or denied when passed through an HTTP proxy. Our modifications are also available as open source. The filter works best with SOAP 1.2, but can handle SOAP 1.1 POST requests with additional overheads.
Work remaining to be done includes