Lab 10.3.7.2: Optimizing Frame Relay Traffic Using Compression
Objective:
To optimize the Frame Relay traffic between the Houston router and the Orlando router using compression methods.
Scenario:
Our company has analyzed the bandwidth usage on our Houston-to-Orlando Frame Relay connection and found that the line is being used to its maximum capacity. There is not enough money in the budget to upgrade the line, so it is our responsibility to find a way to move more information across the existing link. Our solution: compression.
Lab Tasks:
Cable the lab as shown in the
diagram.
The next task is to set up the routers and the Frame Relay connection as in previous labs:
Configure IGRP routing with the
AS# 777.
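As a sketch, the IGRP configuration on the Houston router might look like the following (the network statement assumes the 172.17.0.0 addressing scheme suggested by the map commands later in this lab; substitute your own network number):

Houston(config)#router igrp 777
Houston(config-router)#network 172.17.0.0

Repeat the equivalent commands on Orlando with the same autonomous system number, 777, so the two routers exchange routes.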
Configure the correct IP addresses
on each of the routers.
Configure the encapsulation type, DLCI, LMI type, and any Frame Relay maps.
What is the command for setting the Frame Relay DLCI for this interface?
Don't forget the
no shutdown command
on the interfaces.
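Taken together, the serial interface configuration on Houston might resemble the sketch below. The interface number, subnet mask, and LMI type are assumptions; the IP addresses and DLCI are inferred from the map commands shown later in this lab, so adjust all values to match your lab diagram:

Houston(config)#interface serial 0
Houston(config-if)#ip address 172.17.17.18 255.255.255.0
Houston(config-if)#encapsulation frame-relay
Houston(config-if)#frame-relay lmi-type cisco
Houston(config-if)#frame-relay map ip 172.17.17.17 18 broadcast
Houston(config-if)#no shutdown

Orlando would mirror this configuration with its own address (172.17.17.17), its own DLCI (17), and a map pointing back at Houston.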
Before configuring compression on the routers, test your configurations by pinging the other router. This way you are ensuring that your basic configuration is correct before changing it.
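For example, a quick connectivity check from Houston (the address is assumed from the map commands later in this lab) might be:

Houston#ping 172.17.17.17

Five successful ICMP replies indicate that the PVC, the maps, and the IP addressing are all working.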
The next step before adding compression is to baseline how much information can flow across the link. To do this, you will need a Fluke® meter capable of generating network traffic and running throughput tests, or some type of network testing software that is capable of measuring line throughput. Execute several tests on your Frame Relay network to give you a baseline of current Frame Relay link performance.
In the space below enter your
findings on the current network performance.
The final step is to configure compression on the WAN interface of each router:

Houston(config-if)#frame-relay map ip 172.17.17.17 18 payload-compress frf9 stac
Orlando(config-if)#frame-relay map ip 172.17.17.18 17 payload-compress frf9 stac
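To confirm that the new maps took effect and that the Stacker (FRF.9) payload compression is active, you might use the following show commands (output format varies by IOS version and platform):

Houston#show frame-relay map
Houston#show compress

The map output should list the payload-compress option on the PVC, and the compression statistics should begin counting once traffic flows.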
Are there any other types of compression available? If so, what are they?
Rerun your network performance tests. In the space below, enter your findings on the new network performance.