Benchmarking tools enhance wireless test and measurement for competitive analysis
Benchmarking tools are not only a means of rectifying specific network problems and ensuring ongoing quality; they also document a carrier's competitive advantage over similar wireless services.
Competitive benchmarking tools are a new addition to the test and measurement arsenals of wireless carriers. These tools allow carriers to measure their quality of service (QOS) against competing cellular, personal communications services (PCS) and enhanced specialized mobile radio (ESMR) carriers in their geographic markets.
What is competitive QOS?
The importance of competitive QOS has grown as the wireless industry has matured and new licenses have been granted or auctioned to aggressive new carriers. QOS will separate the winners from the losers in the competitive wireless industry.
Before the turn of the millennium, high-quality audio performance from wireless systems will become a competitive necessity, something that subscribers will take for granted. Operators who fail to deliver acceptable audio performance, as measured by audio quality and other metrics, will rapidly lose customers to any of several competitors, be they traditional cellular carriers or the new breed of PCS, ESMR, wireless local loop (WLL) and mobile satellite service (MSS) providers.
The subscriber's ear is the final judge of any wireless telephone system. Customers do not care about network technology and engineering issues, such as time-division multiple access (TDMA) vs. code-division multiple access (CDMA). They just care about performance. They will have every incentive to switch among competing wireless service providers, technology notwithstanding, in the same way that they already change their landline long-distance providers.
This means that wireless operators will have to focus increasingly on subscriber perceptions of quality. They will need to concentrate their efforts on areas of competitive weakness, such as the incidence of noise and distortion due to radio frequency (RF) phenomena. Quality-related system test, measurement and optimization activities will grow in importance.
QOS is a composite metric made up of several call-related factors that contribute directly to end-user satisfaction.
Call initiation – When the subscriber hits "send," does the call go through? There are three possible results from hitting the "send" key:
* no-service areas – The call does not go through, and the subscriber has to wait until he or she reaches another coverage area.
* blocked calls – The call does not go through, and the subscriber has to hit the "send" key a second time.
* good initiations – The call goes through.
Call access – Assuming a good initiation, how long did it take for the system to grant the subscriber access to a voice channel? This is not a critical factor unless access times exceed 15 seconds, at which point the subscriber starts to wonder if the phone is working properly.
Call quality – Once the call has gone through, how clear, intelligible and distortion-free is the audio transmitted between both ends of the connection? Graded on a mean opinion score (MOS) basis, there are five possible scores:
5 = excellent (no distortion detected).
4 = good (a little distortion detected; not annoying to the listener).
3 = fair (some distortion; slightly annoying).
2 = poor (significant distortion; annoying).
1 = bad (almost unintelligible; very annoying).
Call completion – Assuming that audio quality is good enough (and does not annoy callers to the point where they abandon the call in frustration), was the call completed successfully? A call can end for the subscriber in two ways:
* normal end – The call ends on demand, when the caller presses the "end" key or places the handset on-hook.
* dropped calls – The system drops the call before the customer is ready, such as through faulty hand-offs.
Each carrier (indeed, each subscriber) will apply different weightings to these various QOS factors, depending on background, expectations, applications and the current state of wireless technologies.
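To make the weighting idea concrete, here is a minimal sketch of how the four factors might be folded into a single composite score. The normalizations, weights and 15-second access threshold are illustrative assumptions only, not an industry formula; each carrier would substitute values that reflect its own subscribers and market.

```python
# Minimal sketch of a weighted composite QOS score. The normalizations,
# weights and 15-second access threshold are illustrative assumptions;
# each carrier would choose its own based on its subscribers and market.

def composite_qos(initiation_success, mean_access_secs, mean_mos, completion_rate,
                  weights=(0.30, 0.10, 0.40, 0.20)):
    """Combine the four call-related QOS factors into a single 0-100 score."""
    initiation = initiation_success                   # fraction of good initiations
    access = max(0.0, 1.0 - mean_access_secs / 15.0)  # 15 s = patience threshold
    quality = (mean_mos - 1.0) / 4.0                  # MOS 1..5 mapped to 0..1
    completion = completion_rate                      # fraction of normal call ends
    w_init, w_acc, w_qual, w_comp = weights
    return round(100.0 * (w_init * initiation + w_acc * access +
                          w_qual * quality + w_comp * completion), 1)

# Example: 97% good initiations, 6 s mean access, MOS 3.8, 95% normal ends.
print(composite_qos(0.97, 6.0, 3.8, 0.95))  # 82.1
```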
The quality bar on all of these parameters will only get higher over time. Thanks to technological advances and fierce competition, customers have come to expect continual improvement in wireless voice quality. As wireless systems mature, customers will expect wireless voice quality on a par with traditional wireline telephone service.
Troubleshooting QOS problems
QOS and user satisfaction depend not only on the expectations of subscribers, but also on the intricate, end-to-end network infrastructure that supports a call. The perceived quality of audio transmissions in a wireless network depends on the performance of every system and component between the people speaking, including every switch, trunk, line, cross-connect and RF channel. It also depends on the quality of the subscribers' mobile terminals and how those phones are used.
When analyzing the quality of a given voice call, one must consider the technical environment specific to that call. Most calls placed from or to cellphones involve connections with the landline public switched telephone network (PSTN), as shown in Figure 1 on page 16.
Audio quality problems can be introduced at any of the following nodes or circuits:
* landline phone.
* inside wiring (including jacks, horizontal cabling, punchdown blocks and cross-connects) connecting the landline phone to an office private branch exchange (PBX).
* PBX, including trunks connecting to the PSTN.
* PSTN, including trunks connecting to the cellular operator's mobile telephone switching office (MTSO).
* MTSO.
* base station controller (BSC) or mobile switching center (MSC).
* voice coder.
* base station.
* RF channel.
* cellphone.
In this end-to-end call configuration, the only components typically under the wireless service provider's control are the MTSO, BSC or MSC, voice coder, base station, intranetwork trunk routes and, to a lesser extent, the RF channel. Service providers have little control over the type or quality of terminal equipment employed by their subscribers or over subscribers' usage patterns and habits.
In most operational wireless networks, the performance of infrastructure hardware (the MTSO, coder, trunks and base stations) is fairly stable. The basic network architecture is modified slowly over time, usually in a tightly controlled fashion subject to thorough testing and quality assurance. Consequently, the infrastructure establishes a baseline quality to which customers, in theory, adjust their expectations.
Isolating RF channel QOS problems
The one QOS factor that fluctuates constantly is the RF channel. Its reliability is affected by dynamic atmospheric, morphologic, geologic, hydrologic and traffic conditions present between fixed base stations and roving users. Fading, multipath, interference, noise and hand-offs can degrade audio quality and cause it to fluctuate widely, sometimes within a single phone call. One could argue that the bulk of users' reported audio quality problems on wireless networks can be attributed to dynamic RF conditions that were not foreseen or addressed adequately in the network design.
Many audio problems are caused or aggravated by RF factors and are commonly experienced in calls between landline and wireless telephone systems. The problems include: fading, static, noise, distortion, cross-talk, interference, clipping, echo and delay.
Wireless service providers usually address RF-related audio problems through regular modifications to their RF network design, such as adding base stations and repeaters, subdividing cells and sectors, and changing the tilt and beamwidth of transmitters.
No wireless carrier can claim to have eliminated RF-related QOS deficiencies from every hill, dale and office park in its market. Carriers do their best to maximize coverage, keep unwanted interference at bay, and minimize bit-error rates and dropped hand-offs, but RF links are notoriously trouble-plagued. Bedeviled by tricky terrain, volatile weather and an ever-shifting landscape of fixed and mobile obstructions, RF propagation modeling is an inexact science, an engineer's rolling headache of network remodeling, retesting and reconfiguration. Each wireless carrier focuses on addressing a "Swiss cheese" pattern of weak zones in its own system while searching for similar deficiencies in competitors' networks that can be called to a customer's attention.
Competitive benchmarking tools
Into this intensifying dogfight step the competitive benchmarking tools. To design our tool system, Benchmarker, we identified three fundamental tasks the system would have to perform: emulation, comparison and analysis.
*Emulate the subscriber – The benchmarking tool should automatically place and receive wireless phone calls; take QOS measurements on those calls (initiation, access, quality and completion factors); geolocate QOS measurements with a navigation technology, such as GPS; and log measurements with navigation coordinates (see the sketch following this list).
*Compare access protocols, bands and service providers – The tool should place calls and take QOS measurements on all wireless access protocols, bands and service providers in a given geographic market. This would enable "apples-to-apples" QOS comparisons across service providers implementing different technologies. It should be possible to place multiple calls concurrently from the same mobile benchmarking tool to as many as eight wireless service providers, reflecting the number of cellular, PCS and ESMR (and, soon, mobile satellite) service providers available in a given geographic market. A full-fledged competitive benchmarking tool should support all of the following common digital and analog wireless protocols: AMPS, IS-54B, IS-136 (800 MHz and 1.9 GHz), IS-95 (800 MHz and 1.9 GHz), GSM (900 MHz, 1.8 GHz and 1.9 GHz) and iDEN.
*Analyze the data – The tool should come with software that produces detailed, management-level reports, statistical graphs and location plots. Using these outputs to identify competitive strengths and weaknesses, a wireless carrier can then use the engineering data collected simultaneously with the competitive data to locate and resolve problems quickly.
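The following sketch shows what the subscriber-emulation loop might look like in software: place a test call on each benchmarked carrier, record the four QOS factors and tag each record with GPS coordinates. The `phones` and `gps` driver objects and the result fields are hypothetical stand-ins for the benchmarking hardware interfaces, not Benchmarker's actual API.

```python
# Minimal sketch of the "emulate the subscriber" measurement loop. The
# `phones` and `gps` driver objects and the result fields are hypothetical
# stand-ins for the benchmarking hardware, not Benchmarker's actual API.
import csv
import time

def run_session(phones, gps, dial_number, n_calls, log_path):
    """Drive-test loop: n_calls rounds across all carriers, logged to CSV."""
    with open(log_path, "w", newline="") as f:
        log = csv.writer(f)
        log.writerow(["timestamp", "carrier", "lat", "lon",
                      "initiation", "access_secs", "mos", "completion"])
        for _ in range(n_calls):
            for phone in phones:            # one phone per benchmarked carrier
                lat, lon = gps.position()   # geolocate the measurement
                started = time.time()
                result = phone.place_call(dial_number, hold_secs=60)
                log.writerow([started, phone.carrier, lat, lon,
                              result.initiation,    # "good", "blocked" or "no_service"
                              result.access_secs,   # time to voice-channel grant
                              result.mos,           # audio quality score, 1-5
                              result.completion])   # "normal" or "dropped"
```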
Benchmarking tool architecture
A full-featured, competitive benchmarking tool requires mobile, landline, data collection and audio quality components.
*Mobile unit – This unit, typically installed in a "drive-test" vehicle, consists of multiple portable wireless phones and scan receivers covering all protocols and bands to be benchmarked; a navigation system, such as GPS or Travelpilot, to correlate measurement data with latitude and longitude coordinates; and a portable computer to set up measurement sessions, autodial the unit's phones, store collected QOS metric data and view data as it is being collected. Special mounting hardware is needed to seat the unit's chassis and all portable phones in the vehicle's trunk. The mobile unit draws power through a direct connection to the vehicle's battery.
*Landline unit – This unit, usually installed at a carrier's network operations center, MSC or MTSO, consists of landline phones that automatically (via computer scripts) dial or answer calls to and from the phones in the mobile unit. This simulates the typical mobile-to-landline configuration of most wireless calls. Whereas the mobile unit collects QOS data on the downlink, the landline unit collects corresponding data on the uplink of a call. Time synchronization ("handshaking") between the landline and mobile units is necessary to merge and correlate the resulting data into a composite portrait of wireless QOS (see the sketch following this list).
*Data collection, analysis, post-processing and reporting – This requires software with geographic information system (GIS) capabilities. As previously discussed, this software should support collection, correlation, analysis and display of uplink and downlink QOS data.
*Audio quality measurement – Both the mobile and the landline units should contain audio quality measurement modules, usually based on digital signal processing (DSP) technology, that produce MOS-like ratings of downlink and uplink audio quality on calls.
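A minimal sketch of the merge step is shown below, assuming both units stamp their records with synchronized clock time; the record layout is hypothetical, but the substance, pairing each downlink record with the nearest-in-time uplink record, is the "handshaking" correlation described above.

```python
# Sketch of the uplink/downlink merge, assuming both units stamp each record
# with synchronized clock time. The record layout is hypothetical.

def merge_records(mobile_records, landline_records, window_secs=2.0):
    """Pair downlink (mobile) and uplink (landline) records by timestamp."""
    merged = []
    for down in mobile_records:
        nearest = min(landline_records,
                      key=lambda up: abs(up["timestamp"] - down["timestamp"]),
                      default=None)
        if nearest and abs(nearest["timestamp"] - down["timestamp"]) <= window_secs:
            merged.append({"timestamp": down["timestamp"],
                           "lat": down["lat"], "lon": down["lon"],
                           "downlink_mos": down["mos"],     # heard by the mobile
                           "uplink_mos": nearest["mos"]})   # heard by the landline
    return merged
```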
Audio quality assessment technology
LCC's audio quality assessment module, called Auryst (aural analyst), measures voice quality in both digital and analog wireless systems based on validated models of user audio perception. The module measures wireless network voice quality as it would be perceived by the average listener. The module generates audio quality scores (AQS) on a scale of 1 to 5, as previously discussed, that are highly correlated (>95%) to known differential MOS (DMOS) data validated by Comsat Laboratories, which performs listener evaluations.
The audio quality module is based on an LCC-developed model (patent pending) of auditory perception that is derived from wavelet decomposition theory and accounts for auditory nerve-firing rates. The model uses critical band analysis and focuses on elements of speech and sound that are critical to user perception, such as frequency sensitivity, time resolution, masking phenomena and intensity.
The audio quality module uses built-in reference speech samples, male and female, that are transmitted through the wireless system to establish a standard of acceptable voice quality. The speech samples are phonetically balanced Harvard sentence pairs, which ensures that audio quality measurements are taken on a representative phonemic sampling. Using high-speed DSP technology and a custom filter bank, the module can analyze and quantify distortion between reference and impaired samples in real time. Alternatively, this information can be collected for mapping, analysis and reporting in a Windows-based, post-processing application.
The module is used to generate "source-referenced" AQS metrics, which are analogous to source-referenced DMOS. Source-referenced AQS measures RF- and coder-impaired speech against "clean," pre-coder source speech. This is useful for characterizing the quality of calls on a single wireless system, including coder- and RF-related impairments. Figure 2 on page 26 illustrates how the module computes an AQS.
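To give a concrete feel for the general shape of source-referenced scoring (though emphatically not LCC's patented perceptual model), the toy sketch below compares degraded speech against the clean reference in a coarse banded-spectrum domain and maps the distortion onto the 1-to-5 scale. Identical signals score 5.0; a mean band distortion of 20 dB scores 1.0.

```python
# Toy illustration of source-referenced scoring, NOT LCC's patented model:
# compare degraded (RF- and coder-impaired) speech against the clean
# reference in a coarse banded-spectrum domain, then map the distortion
# onto the 1-5 AQS scale. Band count and scaling are arbitrary assumptions.
import numpy as np

def band_energies(signal, n_bands=20):
    """Average log-energy per frequency band (a crude critical-band proxy)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return np.array([10 * np.log10(band.mean() + 1e-12)
                     for band in np.array_split(spectrum, n_bands)])

def aqs(reference, degraded):
    """Map mean band-wise log-spectral distortion (dB) to a 1..5 score."""
    distortion = np.abs(band_energies(reference) - band_energies(degraded)).mean()
    return float(np.clip(5.0 - distortion / 5.0, 1.0, 5.0))  # 0 dB -> 5.0 (excellent)
```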
AQS is measured in simplex mode, on either the forward or reverse channel, at any point during a measurement. In addition to AQS, the audio quality module records various per-call parameters and call statistics, such as the incidence of call setup failures, busy signals, dropped calls, hand-off exceptions and completions.
The audio quality measurement module is designed to work with the leading digital and analog wireless protocols. RF engineers can receive and merge uplink and downlink audio quality measurements, thereby supporting assessment of the baseline quality of an entire system. Moreover, engineers can use the technology to compare the quality of their service with that of competitors, regardless of which technology the competition uses.
The Benchmarker system measures audio quality on as many as eight digital or analog networks simultaneously. The landline audio quality measurement module performs uplink quality measurements on audio or tones received from the mobile unit, while the mobile devices measure downlink quality in similar fashion, on audio or tones transmitted from the landline unit.
Test calls can be initiated and answered automatically by either the landline or mobile unit. The landline unit is capable of unattended, script-driven operation, with a script for each analyzer controlling calling and answering, and it can be located for call termination at a mobile switching center or at a standard PSTN line interface. The landline unit supports interfaces to analog North American PSTN access loops and to international digital phone systems.
Ongoing benchmarking
Competitive benchmarking is an important activity that is of common interest to engineering, marketing and executive-level personnel. Benchmarking is often used to set engineering goals and to guide network expansion and optimization activities. Benchmarking must be based on valid methodologies that ensure the resulting data is clear and easy to interpret by all parties. To work through the complexities of a benchmarking operation, carriers should expect the benchmarking provider to offer tools, methodology and training.
*Benchmarking tool – The tool must support all of the carrier's benchmarking requirements, as discussed above, and be easy for engineering staff to install, use, administer and maintain. Ideally, the tool should output data to the RF post-processing and analysis packages that carrier personnel are already using.
*Methodology development – Without appropriate data collection and analysis procedures, the competitive benchmarking results obtained can be extremely misleading. Benchmarking procedures must mesh with the carrier’s system engineering and field operations and be capable of being carried out consistently, over time and across markets, by carrier personnel and engineering consultants. Better results can be obtained from benchmarking tools when they are bundled with consulting services that show the carrier’s engineering staff how to make the most effective use of the equipment, analyze the benchmark data, and use those analyses to modify and optimize their systems.
*Documentation, training, and support – Benchmarking tools and operational procedures must be presented to carrier staff through intensive, upfront training. The provider must be able to supply ongoing remedial training and in-service support. Well-organized, easy-to-read course materials, user manuals, application notes and service bulletins should be provided.
Effectively implementing a complex measurement system requires a carrier to rethink how it monitors its RF networks and to reconfigure support systems accordingly. The measurement system should include a landline unit, a mobile unit, and data collection, analysis, post-processing and reporting software with GIS capabilities.
Conclusion
Competitive benchmarking solutions provide the hard QOS data demanded by today's maturing, competitive wireless industry. With this data, wireless operators can perform intelligent optimization of their networks. Instead of simply looking for problem areas in their networks, they can look for areas of competitive weakness. In this way they can better prioritize which network problems need to be addressed and better answer the question, "How good is good enough?"
Supplying the data "ammunition" for quality "shoot-outs" between rival carriers, these tools are catalysts for raising industry-wide QOS levels to new heights. Subscribers, and the industry as a whole, can only benefit from this renewed attention to service quality.
Kobielus and Woessner are product managers for LCC International, McLean VA. Email: [email protected] or [email protected].