36 Discuss the strategies implemented for resource optimization.
For today's networks, speed is a crucial factor: networks are congested with traffic and bandwidth is limited, so policies are needed to ensure bandwidth is used optimally. These policies and strategies are collectively known as Quality of Service (QoS). The following are part of QoS:
- Traffic Shaping
- Load Balancing
Let us look at each of them in detail:
QoS: Quality of Service, or QoS, is the term used to describe the strategies involved in managing and prioritizing the flow of traffic. Administrators are able to predict and monitor the use of bandwidth and ensure its availability across the network for applications that require it. These applications can be broken down as:
- Latency Sensitive: These applications are demanding when it comes to bandwidth as lag time affects their efficacy.
- Latency Insensitive: Managing latency insensitive applications is also a part of managing bandwidth. Bulk data transfers are latency insensitive transfers.
Bandwidth is a limited resource, and network traffic grows day by day. Latency-sensitive traffic is demanding when it comes to bandwidth, so it is best to prioritize traffic to ensure it is delivered on time. QoS plays an important role by ensuring that applications like video conferencing do not affect overall traffic throughput negatively: it queues traffic depending on how soon it must be delivered.
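The queuing idea described above can be sketched in Python. This is a toy model, not a real scheduler: packets are tagged with a priority class, and latency-sensitive traffic (lower priority number) is always dequeued before bulk traffic.

```python
import heapq

class PriorityTrafficQueue:
    """Toy QoS queue: packets with a lower priority number are sent first;
    a counter preserves FIFO order within the same priority class."""
    def __init__(self):
        self._heap = []
        self._counter = 0

    def enqueue(self, packet, priority):
        heapq.heappush(self._heap, (priority, self._counter, packet))
        self._counter += 1

    def dequeue(self):
        # Pop the highest-priority (lowest-numbered) packet
        return heapq.heappop(self._heap)[2]

q = PriorityTrafficQueue()
q.enqueue("bulk-transfer-1", priority=2)   # latency-insensitive
q.enqueue("voip-frame-1", priority=0)      # latency-sensitive
q.enqueue("video-frame-1", priority=1)
print(q.dequeue())  # voip-frame-1 — the VoIP frame jumps the queue
```

Even though the bulk transfer arrived first, the VoIP frame is delivered ahead of it, which is exactly the behavior QoS prioritization aims for.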
Latency-Sensitive High-Bandwidth Applications: Many applications are highly demanding when it comes to bandwidth. Two common examples are VoIP and video applications.
Voice over Internet Protocol (VoIP): This application transfers human voice over the internet using data packets: the voice is converted to digital form, encapsulated in packets, and reconverted to audio at the destination. VoIP is used as an alternative to the public switched telephone network (PSTN); unlike the PSTN, it can also carry data, images, graphics and video.
The protocol used by this application is RTP (Real-time Transport Protocol), carried in UDP/IP packets. UDP is a fire-and-forget protocol, which keeps delivery fast. The issues working against this application are latency and security. To address the security concern, RTP has been extended with a secure version, Secure RTP (SRTP).
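To make the packet structure concrete, the sketch below packs a minimal 12-byte RTP header of the kind a VoIP sender prepends to each audio chunk before handing it to UDP. The field values (payload type, sequence number, timestamp, SSRC) are illustrative placeholders, not a working VoIP stack.

```python
import struct

def build_rtp_header(payload_type, seq, timestamp, ssrc):
    """Pack a minimal 12-byte RTP header: version 2, no padding,
    no extension, no CSRC list, marker bit clear."""
    first_byte = 2 << 6                 # version = 2 in the top two bits
    second_byte = payload_type & 0x7F   # marker bit clear + 7-bit payload type
    return struct.pack("!BBHII", first_byte, second_byte, seq, timestamp, ssrc)

hdr = build_rtp_header(payload_type=0, seq=1, timestamp=160, ssrc=0x1234)
print(len(hdr))  # 12

# The header would then be prepended to an audio chunk and sent over UDP, e.g.:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(hdr + audio_chunk, (dest_ip, dest_port))
```

Because UDP neither retransmits nor reorders, the sequence number and timestamp in the header are what let the receiver detect loss and play samples out at the right time.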
Video Applications: The role of the internet is growing not only in business but also in the world of entertainment. It is used to watch television shows, news broadcasts, videos and more. Viewing this content smoothly requires protocols that take care of speed, as video is latency sensitive. The underlying transport protocols are TCP and UDP; the three protocols commonly used on top of them are RTP (Real-time Transport Protocol), RTSP (Real Time Streaming Protocol) and RTCP (RTP Control Protocol).
Traffic Shaping: This is a QoS strategy that works by prioritizing transmissions, with the aim of reducing latency. It does so by regulating the quantity of data moving into and out of the network; the network policy decides how data is categorized, queued and directed. Several strategies are used to regulate the data. The common methods are:
- By Application: Traffic is categorized by type and assigned a bandwidth limit. For example, FTP traffic can be identified and capped so that no more than 4 Mbps is dedicated to it.
- Network traffic per user: Traffic shaping can also be done on a per-user basis; that is, bandwidth is allocated among the users. This does not limit the content itself but the speed at which the content can be retrieved.
- Priority Queuing: This queues traffic depending on how important it is to the purposes for which the network was set up. For example, in an academic network, use of the network for recreational purposes can be limited.
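A common mechanism behind per-application bandwidth limits like the 4 Mbps FTP cap above is a token bucket. The sketch below is a minimal, single-threaded illustration of the idea, not a production shaper: tokens refill at the configured rate, and a packet is admitted only if enough tokens are available.

```python
import time

class TokenBucket:
    """Token-bucket shaper: permits bursts up to `capacity_bytes` and a
    sustained rate of `rate_bps` bits per second (e.g. a 4 Mbps FTP cap)."""
    def __init__(self, rate_bps, capacity_bytes):
        self.rate = rate_bps / 8.0        # refill rate in bytes per second
        self.capacity = capacity_bytes
        self.tokens = capacity_bytes      # bucket starts full
        self.last = time.monotonic()

    def allow(self, packet_bytes):
        # Refill tokens for the time elapsed since the last check
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False  # over the limit: queue or drop the packet

bucket = TokenBucket(rate_bps=4_000_000, capacity_bytes=10_000)
print(bucket.allow(1500))  # True — a full bucket admits the first packet
```

Packets that return False are held back (or dropped), which is how the shaper keeps the application's long-run throughput at or below the configured rate.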
Load Balancing: As the demands on an organization's servers and systems increase, so does the load on those servers. Load balancing is a strategy for distributing that load among different networked systems; a group of servers sharing load in this way is termed a server farm. Demand is shared between multiple CPUs, network links and hard disks, resulting in faster response times, distributed processing and optimal resource utilization. Server farms underpin the delivery of internet services: websites considered high performance rely on them for scalability, reliability and low latency.
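The simplest distribution policy a load balancer can use is round robin, handing each incoming request to the next server in the farm. The sketch below illustrates just that policy; the server addresses are made-up examples, and real balancers add health checks, weighting and session affinity on top.

```python
import itertools

class RoundRobinBalancer:
    """Hands each incoming request to the next server in the pool, in turn."""
    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)

    def pick(self):
        return next(self._cycle)

farm = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
assignments = [farm.pick() for _ in range(6)]
print(assignments)
# ['10.0.0.1', '10.0.0.2', '10.0.0.3', '10.0.0.1', '10.0.0.2', '10.0.0.3']
```

Six requests land evenly, two per server, which is the load-sharing effect described above.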
Caching Engines: Caching is important for optimizing network traffic. Proxy servers use it to limit the client requests sent out to the internet: a copy of each requested page is kept in the cache area, and when a subsequent request for that page arrives from the same or a different client on the network, the copy is served to the user rather than fetching the page again. This goes a long way toward reducing the traffic forwarded to the internet and results in gains for the network. Administrators consider the following when deciding what to cache:
- What sites are to be added to the cache;
- How long will the information be stored in the cache;
- How often is the cached information to be updated;
- What will be the size of the cached information;
- What type of content is to be put in the cache;
- Who is authorized to access the cache.
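Two of the decisions above, how long information stays in the cache and when it is refreshed, can be captured with a simple time-to-live (TTL) rule. The sketch below is a minimal illustration of a proxy-style cache; the `fetch` callback and URLs are stand-ins for a real HTTP fetch.

```python
import time

class ProxyCache:
    """Minimal TTL cache of the kind a proxy might keep for fetched pages."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # url -> (fetched_at, body)

    def get(self, url, fetch):
        entry = self._store.get(url)
        if entry and time.monotonic() - entry[0] < self.ttl:
            return entry[1]            # cache hit: no trip to the internet
        body = fetch(url)              # cache miss or stale: forward the request
        self._store[url] = (time.monotonic(), body)
        return body

fetches = []
def fake_fetch(url):                   # stand-in for a real HTTP request
    fetches.append(url)
    return f"<html>{url}</html>"

cache = ProxyCache(ttl_seconds=60)
cache.get("example.com/a", fake_fetch)
cache.get("example.com/a", fake_fetch)  # second call is served from cache
print(len(fetches))  # 1 — only one request actually went out
```

Tuning `ttl_seconds` is the trade-off the administrator's checklist describes: a longer TTL saves more bandwidth but risks serving stale content.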
The advantages of using caching are:
- Increased Performance: Cached information is stored on local systems close to the user, ensuring that information is retrieved at a faster rate.
- Data Availability: There can be situations where the data or applications being accessed are unavailable because of failures. In such situations, the information stored in the cache area proves useful.