Contents

Introduction

One of the key features in Microsoft® SQL Server™ 2000 Analysis Services is the capability to access data from the Web. The PivotTable® Service (PTS) OLE DB provider uses a special Active Server Pages (ASP) Web page to send and receive data using the HTTP or HTTPS protocol. The ASP page uses a single COM object, called the data pump, which acts as a conduit for Analysis Services data. The library that contains this COM object is referred to as the data pump library.

Analysis Services originally required Internet Information Services (IIS) to be installed on the same server, because the data pump library used shared memory to communicate directly with the Analysis server. With the release of Service Pack 3 (SP3) for SQL Server 2000 Analysis Services, however, the data pump library can also use a named pipe to share information with the Analysis server. The use of named pipes allows the data pump library to be installed on a separate computer running IIS, thereby increasing the security and stability of both IIS and Analysis Services.

This paper explains the data pump architecture in more detail, describes how to install the data pump on a separate computer, and discusses considerations for implementing the new SP3 data pump functionality. The paper assumes the reader has a working knowledge of SQL Server 2000 Analysis Services, the VBScript scripting language, and Internet concepts, including familiarity with HTTP connectivity.

HTTP Connectivity and Analysis Services

To use HTTP connectivity when connecting to an Analysis server from a client application, provide a URL to PTS. PTS then attempts to access a specific ASP page, named msolap.asp, at the Web site indicated by the URL or IP address. Although you can place the msolap.asp file in the root virtual directory, we recommend using a separate virtual directory (e.g., http://myserver/app1) for security reasons. The system always appends the string "/msolap.asp" to whatever URL is specified. Msolap.asp provides access to the data pump object, named PUPump, implemented in the msmdpump.dll data pump library. The PUPump object communicates directly with the Analysis server and uses the msolap.asp page to pass data and meta data back to PTS.
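For illustration, the URL handling can be sketched in a few lines of Python (the server name, virtual directory, and catalog below are hypothetical; this mimics the documented behavior, not the PTS implementation):

```python
def pts_target_url(data_source: str) -> str:
    # PTS always appends "/msolap.asp" to whatever URL is specified
    # in the Data Source property (trailing slash trimmed here).
    return data_source.rstrip("/") + "/msolap.asp"

# A hypothetical connection string pointing PTS at a virtual directory:
conn = "Provider=MSOLAP.2;Data Source=http://myserver/app1;Initial Catalog=Sales"
props = dict(p.split("=", 1) for p in conn.split(";"))
print(pts_target_url(props["Data Source"]))
# http://myserver/app1/msolap.asp
```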

Figure 1 describes the flow of data when using the data pump in Analysis Services.

Figure 1. Data flow using the data pump in Analysis Services

1. A client application attempts to access Analysis Services using PTS. If a URL is specified in the Data Source connection string property of PTS, PTS appends the string "/msolap.asp" to the URL and uses HTTP to send a POST request to the msolap.asp page, including the connection string and other data and meta data as the binary part of the POST request.

2. The IIS server receives the POST request and, if the request is successfully validated, instantiates the msolap.asp page. As part of that validation, IIS security settings establish the security context in which the msolap.asp page is executed. If Basic authentication is used, PTS exchanges HTTP header messages with IIS, passing the user name and password, and IIS then impersonates that account on the Web server.

3. The msolap.asp page attempts to create an instance of the PUPump object from the Analysis Services data pump library.

4. The PUPump object reads the data from the HTTP stream sent by PTS as part of the POST request into a SAFEARRAY buffer (using the BinaryRead method of the Request ASP object) and attempts to send the buffer to the specified Analysis server.

IMPORTANT: The PUPump object does not examine the underlying data contained in the HTTP stream in any way. The actual underlying data structures within the buffer, such as the connection string, are never examined, nor are they available to any of the code in the data pump library itself.

5. The Analysis server processes and carries out the request made by the data pump.

6. The Analysis server returns a response containing data and meta data to the data pump library.

7. The PUPump object formats the data and meta data into a SAFEARRAY buffer.

8. The PUPump object writes the data back into the HTTP stream (using the BinaryWrite method of the Response ASP object) as part of the POST response.

9. The HTTP stream is sent back to PTS for processing.
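The pump's role as a pure conduit can be sketched as follows (a conceptual Python stand-in, not the real COM interface; FakeAnalysisServer is invented for illustration):

```python
class FakeAnalysisServer:
    # Stand-in for the Analysis server end of the transport; the real
    # pump talks to the server over shared memory or a named pipe.
    def send(self, buffer: bytes) -> bytes:
        return b"<response>" + buffer + b"</response>"

def pump_request(post_body: bytes, server) -> bytes:
    # The POST body is read as an opaque buffer and forwarded verbatim;
    # the server's reply is returned unchanged to the HTTP stream.
    # The pump never parses or inspects the bytes it carries.
    return server.send(post_body)

reply = pump_request(b"<request/>", FakeAnalysisServer())
```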

The data pump library can be configured to use the shared memory transport on the same computer as the Analysis server, or it can be moved to another computer and configured to use the named pipe transport. If you move the data pump library to another computer, you must use the named pipe transport to communicate with the Analysis server. Although the named pipe transport can also be used on the same computer, this is not recommended and should be done only for testing purposes.

Computer Configurations

Figure 1 describes the logical, but not the physical, architecture of HTTP connectivity with IIS and Analysis Services. The physical architecture can be constructed in many ways, which fall into two basic configuration groups.

Single-computer configurations

A single-computer configuration consists of a single computer that hosts both IIS and Analysis Services and is used to provide easy network and HTTP connectivity for users. When a user connects to a single-computer configuration, IIS communicates directly with the Analysis server running on the same computer using shared memory files. Figure 2 illustrates a typical single-computer configuration.

Figure 2. Hosting Analysis Services and IIS on the same computer

Multiple-computer configurations

A multiple-computer configuration consists of one or more computers that host IIS and one or more separate computers that host Analysis Services. A multiple-computer configuration can also involve firewalls, additional servers, and so on. This configuration is used to provide greater security and stability for both network and HTTP connectivity. When a user connects to a multiple-computer configuration, IIS can communicate either directly with an Analysis server or with another IIS server, depending on the configuration. Figure 3 illustrates three different multiple-computer configurations.

Figure 3. Hosting Analysis Services and IIS on more than one computer

For more information about managing network architectures and security involved with multiple-computer configurations, see Network Architecture and Security later in this paper.

Transport Configurations

The transport configurations used by IIS and Analysis Services depend largely on the computer configuration employed by your implementation.

Shared memory transport

Originally, the data pump library used a transport mechanism based on shared memory files to communicate with the Analysis server. A shared memory file allows two or more applications to share memory by mapping a section of virtual memory to a file using a file-mapping object, then using file views to manipulate the shared memory file. In the case of Analysis Services, the system pagefile is used as a shared memory file to pass information between the data pump library and the Analysis server.

The shared memory transport is highly recommended for single-computer configurations for security and performance reasons. The shared memory transport is limited, however, by the fact that both the data pump library and the MSSQLServerOLAPServices service must be running on the same computer, so it cannot be used for multiple-computer configurations.

If you want to use the data pump library with the shared memory transport, see "Configuring IIS to Use the Data Pump" later in this paper for the installation and configuration steps involved in a single-computer configuration. The data pump library is already installed, and no changes to msolap.asp are required to use the shared memory transport in a single-computer configuration, so the sections "Installing the Data Pump Library" and "Configuring the Data Pump" can be skipped.

Named pipe transport

In SP3, the data pump library can now use a transport mechanism based on named pipes to communicate with the Analysis server. The data pump library creates and uses the named pipe \\<ServerName>\pipe\PlatoNamedPipe to communicate with the Analysis server.
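As a quick illustration (the server name is hypothetical), the pipe path is derived from the Analysis server's name:

```python
def analysis_pipe_path(server_name: str) -> str:
    # The pipe name "PlatoNamedPipe" is fixed; only the server part varies.
    return r"\\%s\pipe\PlatoNamedPipe" % server_name

print(analysis_pipe_path("OLAPSRV01"))
# \\OLAPSRV01\pipe\PlatoNamedPipe
```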

The named pipe transport is used for multiple-computer configurations. Although the named pipe transport can be used for single-computer configurations, it is not as efficient as the shared memory transport and is not recommended for use on such configurations. Because using this transport mechanism requires access to NetBIOS, you must also configure your network to allow NetBIOS traffic.

If you want to use the data pump library with the named pipe transport, the following sections describe the steps involved in moving the data pump library to another computer.

Installing the Data Pump Library

If you plan on using IIS and Analysis Services on the same computer, and you want to use shared memory to communicate between the data pump library and the Analysis server, no additional effort is required to use the data pump other than the construction of a Web site in IIS. The Analysis Services setup program automatically installs and registers the data pump library. For information about constructing a Web site in IIS for Analysis Services HTTP or HTTPS connectivity, see "How to Configure Analysis Services for the Web" in SQL Server 2000 Books Online.

If you want to use the data pump library on a separate IIS server, you must install and register the data pump library before you construct a Web site in IIS for Analysis Services HTTP or HTTPS connectivity. The following steps describe how to install and register the msmdpump.dll library on an IIS server:

1. Locate the msmdpump.dll file in the \Bin subdirectory of your Analysis Services installation. For most installations, the full path to this library is C:\Program Files\Microsoft Analysis Services\Bin.

2. Copy the msmdpump.dll file to a subdirectory on the target IIS server.

3. Register the msmdpump.dll file on the target IIS server:

   a. Open a command prompt window.

   b. Change the current directory to the subdirectory that contains the msmdpump.dll file.

   c. Execute the following statement:

      regsvr32 msmdpump.dll

Configuring IIS to Use the Data Pump

To use the data pump with single-computer or multiple-computer configurations, you must first construct a Web site or virtual directory in IIS to provide HTTP or HTTPS connectivity for Analysis Services. You must follow the instructions in this section when using either the shared memory transport or the named pipe transport.

IMPORTANT: This section assumes that you are familiar with IIS usage and practices, and are using the Microsoft Windows® 2000 operating system.

To prepare for IIS configuration, follow these steps:

1. Create a subdirectory in which to store the msolap.asp file. In the following steps, this directory is referred to as the "data pump folder."

2. Locate the msolap.asp file in the \Bin subdirectory of your Analysis Services installation. For most installations, the full path is C:\Program Files\Microsoft Analysis Services\Bin.

3. Copy the msolap.asp file to the data pump folder on the target IIS server.

To enable a new Web site to use the data pump library, follow these steps:

1. Using Internet Services Manager, create a new Web site. For more information about how to create a Web site, see the Internet Information Services documentation.

2. Set the Web site home directory to the data pump folder.

3. Select the Read check box so that Web server permissions for the Web site home directory allow users to view directory content.

4. Select Scripts only in the Execute Permissions list box so that msolap.asp can be executed from the Web site home directory.

5. Right-click the new Web site, and then select Properties to make further modifications.

To enable an existing Web site to use the data pump library, follow these steps:

1. Using Internet Services Manager, create a new virtual directory for the existing Web site. For more information about how to create a virtual directory, see the Internet Information Services documentation.

2. Set the content directory for the virtual directory to the data pump folder.

3. Select the Read check box so that Web server permissions for the virtual directory allow users to view directory content.

4. Select Scripts only in the Execute Permissions list box so that msolap.asp can be executed from the virtual directory.

5. Right-click the new virtual directory, and then select Properties to make further modifications.

You should change the default security settings when IIS creates a new Web site or virtual directory. By default, Internet Services Manager enables anonymous access and Integrated Windows authentication. Because this Web site or virtual directory is not intended for access with an interactive browser, you should not enable multiple authentication methods; instead, configure the Web site or virtual directory to use the single authentication method appropriate for your implementation.

Configuring the Data Pump

If you want to use the data pump library with a multiple-computer configuration, you must alter the msolap.asp file so that the data pump can be redirected to the corresponding Analysis server. This process should be completed after you construct a Web site in IIS for Analysis Services HTTP or HTTPS connectivity.

NOTE: The steps described in this section are not required for using the shared memory transport on single-computer configurations.

To alter your msolap.asp file, follow these steps:

CAUTION: You should first back up your msolap.asp file. Make only the changes to the msolap.asp file indicated below.

Edit your msolap.asp file so that the ServerName property of the pump object refers to the appropriate Analysis server. The following sample shows the required change (the lines that set pump.ServerName):

<%@ LANGUAGE="VBSCRIPT"%>
<%' ************** Do not change this file ***************
' changing of this file can bring unexpected results
' NEVER add any HTML tags. Places that are allowed to
' change will be specified explicitly in the comments.
' ******************************************************
%>
<%Response.Expires = 0%>
<%Response.Buffer=FALSE%>
<%Server.ScriptTimeout=3600%>
<HTML><%
On Error Resume Next
Call ReadData
' This is error handling code and should not be modified.
' This code will take care of the potential errors in
' this asp page.
if ( Err.Number <> 0 ) Then
errstr = "<Error>" + CStr(-8) + "</Error>"
errstr = errstr + "<SysError>" + CStr(err.Number) + "</SysError>"
errstr = errstr + "<Note>" + err.Description + "</Note>"
Response.AddHeader "Pump-Error", errstr
Response.Flush
Response.End
End if
Function ReadData
' ****** You can modify code of this function, but we don't
' recommend doing it. ***************
if (isEmpty(Session("StoredPump"))) Then
Set pump = Server.CreateObject("PUPump.PUPump.1")
Set Session("StoredPump") = pump
else
Set pump = Session("StoredPump")
End if
' Replace <server_name> with the name
' of the destination Analysis server.
pump.ServerName = "<server_name>"
' This value can be changed.
pump.Timeout=60
pump.ReadData
Response.Flush
Response.End
End Function
%>

Save and close the file.

For testing purposes, you can force the use of the named pipe transport on a single-computer configuration. To do so, replace "<server_name>" with "localhost" in the above example. To resume using the shared memory transport on a single-computer configuration, remove the pump.ServerName assignment from the above sample, or set the ServerName property of the pump object to an empty string ("").

IMPORTANT: The named pipe transport should be used with single-computer configurations only for testing purposes. We strongly recommend the shared memory transport for single-computer configurations in all other implementations.

Implementing the Data Pump

The data pump library requires no direct interaction beyond the configuration steps described earlier in order to be used with Analysis Services client applications. However, several security-related considerations should be reviewed before implementing the new data pump functionality in a production environment.

IIS and security

The security of your implementation largely depends on the proper configuration of IIS. IIS is key to ensuring that the appropriate user account is being used when accessing Analysis Services. As with other applications, IIS runs the data pump library in the security context of the user requesting access to the Analysis server. The virtual directory provides the security context for the data pump library for the purposes of authentication.

Anonymous access

Use anonymous access for large applications (typically accessed across the Internet) that require minimal resource overhead and have no authentication requirements. However, you must take these considerations into account when enabling anonymous access:

If anonymous access is enabled, all users accessing both the IIS server and the Analysis server appear as the same user (that is, the IIS anonymous user account), and there is no easy way to differentiate one user from another. Although anonymous access is very scalable from a resource perspective, it is next to impossible to track users from a security perspective.

When separating the IIS server from the Analysis server, modify the NT account that the IIS virtual directory uses for anonymous access. Rather than the default IUSR_computername account, use a domain account that is common to both servers, because IUSR_computername is a local computer account that does not have network access. The Analysis server needs a common domain account to impersonate when enforcing security (that is, when determining access to cubes, dimensions, and cells).

If a domain account cannot be created between the two servers, you can configure matching local computer accounts with identical user name and password combinations. Typically this workaround is only used when there is no trust relationship between either the servers or their domains. Administrators must furthermore keep these matching local computer accounts synchronized, either by writing scripts that use Windows Management Instrumentation to automate the process or by manually managing the matching accounts. For example, if a password expires on one server, the password must be changed for all other matching local computer accounts.

Because of its administrative overhead, this workaround is recommended only in those cases where none of the other approaches will satisfy implementation requirements.

Assign the IIS virtual directory anonymous user account to an appropriate role in Analysis Services. Do not assign that account to the OLAP Administrators group merely to provide access to the Analysis server.

Integrated Windows authentication

You should use Integrated Windows authentication when you have controlled intranet access or Virtual Private Network (VPN) based Internet access. There are several issues to consider when using Integrated Windows authentication:

You should use a single sign-on (SSO) facility that is integrated with Windows. Whatever Windows user account the client is logged into will also be used for accessing Analysis Services; there is no special processing done by IIS. Users cannot be logged into Windows on their client computers and appear as a different user to Analysis Services. Using an SSO facility is comparable to using trusted connections in SQL Server.

Domain accounts should be employed as user accounts between the client computer and the IIS server. If domain accounts cannot be employed between the two systems, use the same workaround described earlier in "Anonymous Access." However, because of the high administrative overhead involved in keeping the two local computer accounts synchronized, this approach is impractical for more than a small number of accounts.

The underlying network architecture must support the exchange of domain account information. In an Internet environment, this typically means either not using a firewall or opening up the firewall to more IP ports than would normally be considered safe. If Internet access is required, configure a VPN from the client computer to the network on which the IIS server resides.

Unfortunately, when separating IIS and Analysis Services, Integrated Windows authentication cannot be used in most Windows domain architectures. The default NT4 security subsystem, sometimes referred to as NTLM, only allows a one-hop transfer of credentials. The security credentials that are transferred during the "hop" between the client and the server therefore cannot be re-used when the IIS server attempts to access Analysis Services.

This restriction is not limited to Analysis Services. SQL Server and Exchange have similar constraints. Although it is technically possible to support multiple hops, doing so requires considerably more domain infrastructure. Multiple hops are only supported when running Kerberos authentication. For more information about configuring Analysis Services to use Kerberos authentication, see Knowledge Base article Q817384, "How to Use Kerberos with Analysis Services."

Basic authentication

Use Basic authentication when you have controlled Internet access (that is, individual user accounts). Although Basic authentication is the recommended method of implementing Web connectivity for Analysis Services, the authentication process and its implications are more complex (as well as more useful) than the methods discussed earlier. Take the following considerations into account when using Basic authentication:

In IIS, the user is identified solely by the user name and password information provided in the connection string. The user account of the client computer is not used.

To provide some additional context for this issue, let's review SQL Server connectivity. With SQL Server authentication, the user name and password information provided in the connection string is used to authenticate the user against SQL Server logins. The client sends this information to the server, and the authentication is performed on the server.

With Analysis Services, Basic authentication with HTTP access provides a similar (albeit more complex) facility. Analysis Services only supports Integrated Windows authentication. When PTS receives a Basic authentication challenge sequence from IIS (during the initial HTTP protocol exchange), PTS parses the connection string and sends the user name and password contained in the connection string as the response. This initial exchange happens well before msolap.asp is executed by IIS. IIS uses those credentials to log the thread into the user account for later processing (when msolap.asp is invoked). As far as Analysis Services is concerned (whether running on a single-computer or multiple-computer configuration), the appropriate security environment for establishing the user's rights and permissions is the IIS logon, not the user account of the client computer. Because the IIS logon is being performed on the IIS server, it is considered a single hop from the IIS server to the Analysis server and is therefore allowed. For more information, see Integrated Windows Authentication.

When separating the IIS server from the Analysis server, user accounts should be domain accounts. If domain accounts cannot be employed between the two systems, use the workaround described in Anonymous Access. As with Integrated Windows authentication, this approach is impractical for more than a small number of accounts.

You should also use Secure Sockets Layer (SSL) to provide a secure connection when using Basic authentication, because Basic authentication transmits user names and passwords in an encoded but easily decoded (Base64) format. For more information about how to obtain and configure SSL, see the IIS documentation. Digest authentication is more secure than Basic authentication because it sends a hash rather than the password itself, but it requires the domain controller to store a reversibly encrypted copy of each password.
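To make the exchange concrete, the Basic authentication response that carries the connection string credentials can be sketched in Python (the user name and password are hypothetical; this mirrors standard HTTP Basic authentication, not PTS internals):

```python
import base64

def basic_auth_header(connection_string: str) -> str:
    # Pull User ID and Password out of the connection string and
    # Base64-encode them, as a Basic authentication response does.
    props = dict(p.split("=", 1) for p in connection_string.split(";") if p)
    token = base64.b64encode(
        ("%s:%s" % (props["User ID"], props["Password"])).encode()).decode()
    return "Basic " + token

hdr = basic_auth_header(
    "Data Source=https://myserver/app1;User ID=olapuser;Password=s3cret")
# Base64 is an encoding, not encryption: the credentials are trivially
# recovered, which is why SSL is recommended with Basic authentication.
print(base64.b64decode(hdr.split()[1]).decode())
# olapuser:s3cret
```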

Analysis Services and Security

If the data pump library uses the named pipe transport, the Roles connection string property in PTS is ignored when client applications connect to msolap.asp. Analysis Services uses the role applicable to the user account provided by IIS to determine access to Analysis Services objects.

You can also open only one connection per process when using HTTP to connect to Analysis Services, because the data pump library supports only one HTTP session per process. If you attempt to open a second HTTP session from the same process, the session cookies for the first session are dropped, in effect closing the first connection in order to open the second.

Network Architecture and Security

Network architecture is also a key element to consider when implementing Web connectivity for Analysis Services. Network architecture can range from simple to complex, depending on the security and availability requirements of your enterprise. For example, the physical architecture diagrams provided in this paper for single-computer and multiple-computer configurations are themselves simplified to exclude such elements as gateways, routers, and cache servers.

Configuring client and server ports

One way that firewalls, routers, and other network security and management devices prevent unauthorized access is by disabling unused client and server ports. Depending on the protocols and authorization mechanisms used by your implementation, you need to enable certain client and server ports to allow the data pump library to communicate with both client applications and Analysis servers when using the named pipe transport.

The following table lists the client and server ports that must be enabled to use NetBIOS, the protocol used by the named pipe transport. Depending on your authentication mechanism, you may also need to enable other ports for the data pump library to function correctly. For example, if you use NTLM authentication through your firewall, ports associated with the transfer of NTLM security tokens must be enabled; if you use Kerberos authentication, different ports associated with the transfer of Kerberos tickets must be enabled.

Client port(s)    Server port    Service
1024-65535/TCP    135/TCP        RPC
137/UDP           137/UDP        NetBIOS Name
138/UDP           138/UDP        NetBIOS Netlogon and Browsing
1024-65535/TCP    139/TCP        NetBIOS Session
1024-65535/TCP    42/TCP         WINS Replication

A complete analysis of all required ports depends on the types of protocols and authentication mechanisms being used through the firewall, and is therefore beyond the scope of this paper. For more information about required ports for protocols and authentication mechanisms, see Windows documentation and the Microsoft Knowledge Base.
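For administrators who script their firewall rules, the same port requirements can be expressed as data; a minimal Python sketch (the rule format is invented for illustration):

```python
# (client ports, server port/protocol, service) for the named pipe transport.
NAMED_PIPE_PORTS = [
    ("1024-65535/TCP", "135/TCP", "RPC"),
    ("137/UDP",        "137/UDP", "NetBIOS Name"),
    ("138/UDP",        "138/UDP", "NetBIOS Netlogon and Browsing"),
    ("1024-65535/TCP", "139/TCP", "NetBIOS Session"),
    ("1024-65535/TCP", "42/TCP",  "WINS Replication"),
]

def server_ports(protocol: str) -> list:
    # List the server-side ports for one protocol, e.g. for a TCP rule.
    return [int(sp.split("/")[0]) for _, sp, _ in NAMED_PIPE_PORTS
            if sp.endswith("/" + protocol)]

print(server_ports("TCP"))
# [135, 139, 42]
```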

Limiting access with firewalls and perimeter networks

The product of security and availability is often treated as a constant: to increase security, you must decrease availability, and vice versa. However, this constant depends in part on network architecture. One way to increase this constant (thereby yielding greater security and availability) is through the careful use of firewalls and perimeter networks (also known as DMZs, demilitarized zones, and screened subnets) as part of your network architecture.

A firewall is a mechanism for controlling the flow of data between two parts of a network that are at different levels of trust. Firewalls can range from packet filters, which only allow traffic between specific IP ports or ranges of IP addresses, to application-level firewalls, which actually examine the content of the data and decide whether the data should flow or not. Different types of firewalls can be combined within a given network architecture; for example, sites often implement outward-facing firewalls that filter packets, in conjunction with inward-facing firewalls that filter at the protocol and port layers.

Firewalls are only one element of a sound security architecture. Others include good security policies, such as periodic password expiration; long passwords; passwords with symbols, numerals, and mixed-case letters; and the prohibition of easily guessed passwords, common words, and sequenced passwords. They also include physical components such as smart cards, and biometric products such as fingerprint readers and retina scanners.

Never rely on just one element, such as hardware firewalls, when designing your security architecture; use these elements in combination.

However, firewalls only manage the security part of the constant. To manage the availability part, firewalls are used in conjunction with separate computers to construct perimeter networks. A perimeter network is a single computer or small network inserted as a buffer between the internal network of a company and the external network available to the public. The perimeter network prevents outside users from getting direct access to a server that has company data. A perimeter network is a more secure firewall implementation and effectively acts as a proxy server.

There are many different perimeter network configurations. Discussing all of them in detail is beyond the scope of this paper, but we can review a few basic perimeter network configurations and show how to configure them to work with IIS and Analysis Services.

NOTE: The following configuration diagrams represent logical architectures for the specified scenarios. A production Web site would include additional, redundant components, such as fault-tolerant disk arrays, dual power supplies, dual NICs, redundant network links, failover backup servers, off-facility backup tape storage, and disaster planning (and occasional testing) to ensure that no single point of failure is part of the architecture.

Figure 4 illustrates a common perimeter network configuration for a public Internet site.

Figure 4. Perimeter network for a public Internet site

In this scenario, Analysis Services is separated from the IIS servers so that the IIS servers can access a shared database. This configuration provides load balancing for the Internet site: the site can manage requests across multiple IIS servers. This configuration assumes that the Analysis Services component is a small piece of the overall application and that the IIS servers are used for the rest of the application. There is some security for data residing on the Analysis server, because it sits on a separate physical network (that is, the IIS servers have dual NICs).

By contrast, Figure 5 illustrates a common perimeter network configuration for a public Internet site with a private Analysis server. In this case, we assume that the customer is selling some specific kind of analysis data across the Internet. Thus additional security measures are needed beyond the anonymous access used above. We also assume that the customer has an existing private network for distribution of that data to the public Internet Web site. As you would expect, this has a much more complicated architecture.

Figure 5. Perimeter network for an Internet site with private Analysis server

In this scenario, the IIS servers are separated from the Analysis server so that the Analysis server can be secured on a private network. To limit possible exposure, many high-risk Internet sites will use an optional private firewall. The private firewall is configured so that only a limited number of computers using specific ports can directly move data between the Analysis server and the private network. Security credentials and domain controller traffic should be specifically disabled through the private firewall.

In this case, because the Web site is a syndicated data provider, we will be using Basic authentication for users to access the analysis data. This means that a user name and password must be passed on the connection string.

For example, the user may employ the Multidimensional Connection Wizard in Microsoft Excel, as shown in Figure 6.

Figure 6. Authentication using the Multidimensional Connection Wizard

Notice that we are using the HTTPS protocol for secure SSL-encrypted access. This protocol requires the customer to purchase SSL server certificates from a trusted certificate authority, such as VeriSign. Additional security measures can also be implemented. For example, the Web servers could be configured to serve data only over SSL and to require that the client Web browser present a specific X.509 client certificate. Both of these requirements can be enforced by modifying the msolap.asp file to verify that the transport is HTTPS and that the expected client X.509 certificate is present.
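Such checks can be scripted directly in the ASP page. The following is a minimal VBScript sketch, not the shipped contents of msolap.asp, of what guard logic at the top of the page might look like; it assumes IIS has been configured to accept client certificates for the virtual directory:

```vbscript
<%
' Hedged sketch: reject any request that did not arrive over SSL.
If UCase(Request.ServerVariables("HTTPS")) <> "ON" Then
    Response.Status = "403 Forbidden"
    Response.End
End If

' Require that the browser presented a client X.509 certificate;
' the Subject field is empty when no certificate was supplied.
If Len(Request.ClientCertificate("Subject")) = 0 Then
    Response.Status = "403 Forbidden"
    Response.End
End If
%>
```

Further checks against specific certificate fields (for example, Issuer or SerialNumber in the Request.ClientCertificate collection) could be added in the same way.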

When users connect to the Web server, they are authenticated and logged into the Web servers. These credentials are passed to the Analysis server through the internal firewall. The Analysis server must be a member server of the COMMERCE domain. Thus in addition to the ports needed for access between the Web servers and the Analysis server, the internal firewall must also allow ports and connections between the Analysis server and the COMMERCE domain controller.

To administer the COMMERCE domain, the architecture includes a workstation that is physically adjacent to the COMMERCE domain controller. Domain administrators will use this workstation to add, remove, and change authorized users of the Web site.

The previous scenario can be implemented in a variety of ways. Figure 7 illustrates an alternative approach.

Figure 7. Perimeter network using a VPN connection as the private network

Here, a dedicated VPN connection acts as the private network. Data still moves over the Internet, but to the public Internet site the Analysis server appears to reside on a private network.

Some enterprise-class firewalls, such as the Cisco PIX 515E, combine firewalls with VPN gateways, so the top portion of Figure 7 does not necessarily contain two boxes and a switch. Because of the VPN, additional security measures would include the use of a smart card for VPN authentication and the addition of Microsoft ISA Server on the Analysis server.

Notice that we changed the way in which domain authentication is performed. Rather than opening additional ports and computer access through the internal firewall, we use two domain controllers with a private network between them (used only for domain account replication). There is a one-way trust relationship in which the perimeter network domain trusts the COMMERCE domain. Thus, any Basic authentication logon as COMMERCE\<user name> will be valid and can be impersonated on the perimeter network Web servers. A firewall between the perimeter network and COMMERCE domain controllers limits the open ports to just those needed for domain controller replication.

Because individual computers will be using the VPN to enter the network, we removed the private firewall from the previous architecture. The domain controller is placed on the VPN side of the firewall because remote users will probably maintain the COMMERCE domain (for example, adding and removing users).

In this scenario, because we are using Integrated Windows authentication, we cannot easily separate IIS from the Analysis server. They should run on the same computer because of the one-hop restriction on Windows NT security tokens discussed earlier in this paper.

This scenario relies on the VPN gateway to carry all of the required ports across the public Internet without exposing them publicly. This technique also provides full encryption of the underlying data stream.

Conclusion

The updates included in Service Pack 3 extend the scalability and functionality of Analysis Services, allowing for secure multidimensional data access on the Web by physically and logically isolating the data pump from the Analysis server.

For More Information

SQL Server Books Online contains more information about Analysis Services. For additional information, see the following resources: