Proposed Methodology for Connect America High-Cost Universal Service Support Recipients To Measure and Report Speed and Latency Performance to Fixed Locations

Federal Register Volume 79, Number 224 (Thursday, November 20, 2014)

Proposed Rules

Pages 69091-69095

From the Federal Register Online via the Government Printing Office www.gpo.gov

FR Doc No: 2014-27429

=======================================================================

-----------------------------------------------------------------------

FEDERAL COMMUNICATIONS COMMISSION

47 CFR Part 54

WC Docket No. 10-90; DA 14-1499

Proposed Methodology for Connect America High-Cost Universal Service Support Recipients To Measure and Report Speed and Latency Performance to Fixed Locations

AGENCY: Federal Communications Commission.

ACTION: Proposed rule.

-----------------------------------------------------------------------

SUMMARY: In this document, the Wireline Competition Bureau, the Wireless Telecommunications Bureau, and the Office of Engineering and Technology seek to further develop the record on how compliance with speed obligations should be determined for recipients of high-cost support that deploy broadband networks to serve fixed locations.

DATES: Comments due December 22, 2014.

ADDRESSES: Interested parties may file comments on or before December 22, 2014. All pleadings are to reference WC Docket No. 10-90. Comments may be filed using the Commission's Electronic Comment Filing System (ECFS) or by filing paper copies, by any of the following methods:

Electronic Filers: Comments may be filed electronically using the Internet by accessing the ECFS: http://fjallfoss.fcc.gov/ecfs2/.

Paper Filers: Parties who choose to file by paper must file an original and one copy of each filing.

People with Disabilities: To request materials in accessible formats for people with disabilities (Braille, large print, electronic files, audio format), send an email to fcc504@fcc.gov or call the Consumer & Governmental Affairs Bureau at (202) 418-0530 (voice), (202) 418-0432 (tty).

For detailed instructions for submitting comments and additional information on the rulemaking process, see the SUPPLEMENTARY INFORMATION section of this document.

FOR FURTHER INFORMATION CONTACT: Alexander Minard, Wireline Competition Bureau at (202) 418-7400 or TTY (202) 418-0484.

SUPPLEMENTARY INFORMATION: This is a synopsis of the Wireline Competition Bureau's Public Notice (Notice) in WC Docket No. 10-90; DA 14-1499, released October 16, 2014. The complete text of this document is available for inspection and copying during normal business hours in the FCC Reference Information Center, Portals II, 445 12th Street SW., Room CY-A257, Washington, DC 20554. The document may also be purchased from the Commission's duplicating contractor, Best Copy and Printing, Inc., 445 12th Street SW., Room CY-B402, Washington, DC 20554, telephone (800) 378-3160 or (202) 863-2893, facsimile (202) 863-2898, or via Internet at http://www.bcpiweb.com.

  1. Introduction

    1. In this document, the Wireline Competition Bureau, the Wireless Telecommunications Bureau, and the Office of Engineering and Technology (together, the Bureaus) seek to further develop the record on how compliance with speed (also referred to as bandwidth) obligations should be determined for recipients of high-cost support that deploy broadband networks to serve fixed locations. In addition, the Bureaus seek comment on whether the same testing methodologies adopted for price cap carriers accepting model-based Phase II support should be applied to other recipients of support to serve fixed locations, such as rate-of-return providers and those that are awarded Connect America support through a competitive bidding process. Finally, the Bureaus seek comment on the circumstances that would trigger an audit of the speed and latency metrics.

  2. Measuring Compliance With Service Obligations

    1. Speed Performance Measurement

      1. The record received in response to the 2011 USF/ICC Transformation Order and Further Notice of Proposed Rulemaking, 76 FR 73830, November 29, 2011 and 76 FR 78384, December 16, 2011, on the methodology to be implemented for testing compliance with service obligations was not well developed. The Bureaus now seek to refresh the record on the methodology to be used for demonstrating compliance with the speed obligation for ETCs that receive high-cost support to deploy broadband networks to fixed locations. Should internal network management system (NMS) tools be used to measure speed performance? Alternatively, should external measurement tools such as Speedtest/Ookla or Network Diagnostic Tests (NDT) by M-Labs be used? Are there better and more reliable methods of measuring speed?

      2. Internal NMS tools vary among providers. How can the Commission ensure that internal NMS tool measurements are valid? Will such tools account for multiple transmission control protocol (TCP) streams, TCP window sizes, TCP slow start, and other factors in speed measurement? How would measurements from such tools be verified? Are these types of tools too burdensome or complex for speed measurements? Would such tools have any effect on customer service if used during peak periods? If external testing is adopted, how would measurements be verified? Are there better external measurement tools than those identified above?
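        For illustration only, the following minimal Python sketch shows why the factors listed above matter in practice: it measures downstream throughput using several parallel TCP streams and discards the first seconds of each transfer so that TCP slow start does not depress the result. The test URL, the number of streams, and the timing parameters are assumptions chosen for the example, not values proposed in this Notice.

        # Illustrative sketch only: a simplified multi-stream download test.
        # TEST_URL is a hypothetical test-server endpoint, not a real service.
        import time
        import threading
        import urllib.request

        TEST_URL = "http://speedtest.example.net/100MB.bin"  # hypothetical test file
        NUM_STREAMS = 4          # multiple parallel TCP streams, as typical speed tests use
        WARMUP_SECONDS = 2.0     # discard the TCP slow-start ramp-up
        TEST_SECONDS = 10.0      # measurement window after warm-up
        CHUNK = 64 * 1024

        bytes_counted = 0
        lock = threading.Lock()
        start = time.monotonic()

        def stream_worker():
            global bytes_counted
            with urllib.request.urlopen(TEST_URL) as resp:
                while True:
                    data = resp.read(CHUNK)
                    now = time.monotonic() - start
                    if not data or now > WARMUP_SECONDS + TEST_SECONDS:
                        break
                    if now > WARMUP_SECONDS:       # only count bytes after warm-up
                        with lock:
                            bytes_counted += len(data)

        threads = [threading.Thread(target=stream_worker) for _ in range(NUM_STREAMS)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

        mbps = (bytes_counted * 8) / (TEST_SECONDS * 1_000_000)
        print(f"Aggregate downstream throughput: {mbps:.1f} Mbps over {NUM_STREAMS} TCP streams")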

      3. What testing parameters should be used for speed testing? Should they be different for internal and external testing?

      4. What testing parameters should be used to measure broadband performance for wireless providers offering service at a given address? Should the testing parameters be different if the service utilizes a fixed attachment to the building?

      5. The Bureaus propose to require all ETCs subject to broadband performance obligations to serve fixed locations to utilize testing parameters for speed similar to those already adopted for latency for price cap carriers. Specifically, the Bureaus propose to adopt a methodology that would require measurements to be made once hourly during peak periods (7:00 p.m. to 11:00 p.m. daily local time) over four consecutive weeks; require 95 percent of the observations to be at or above the specified minimum speed; define the endpoints for the measurement as the customer premises to Commission-designated IXP locations; require testing to occur at least annually; and require a minimum of 50 randomly selected customer locations to be tested within the geographic area being funded in a given state. To the extent parties argue that the process adopted for latency testing should be adjusted and used for speed testing, they should describe with specificity what changes should be made. The Bureaus also seek comment on whether the data usage in the proposed tests would have a significant effect on consumers and, if so, how such effects could be mitigated. Should any data caps or monthly usage limits be adjusted to prevent the testing from affecting consumers?
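        As a rough illustration of the compliance criterion described above, the following Python sketch checks whether 95 percent of simulated peak-period observations, collected hourly from 7:00 p.m. to 11:00 p.m. over four consecutive weeks at 50 locations, meet or exceed an assumed minimum speed. The 10 Mbps minimum and the simulated measurement values are placeholders for the example, not obligations stated in this Notice.

        # Illustrative sketch only (not the Commission's method): applying the
        # proposed 95-percent criterion to simulated peak-period observations.
        import random

        MIN_SPEED_MBPS = 10.0          # assumed minimum speed obligation (placeholder)
        REQUIRED_FRACTION = 0.95       # 95% of observations must be at/above the minimum
        LOCATIONS = 50                 # minimum number of randomly selected locations
        DAYS = 28                      # four consecutive weeks
        HOURS = [19, 20, 21, 22]       # one measurement per hour, 7:00-11:00 p.m. local

        # Simulated observations: one speed sample per location per peak hour per day.
        observations = [
            random.gauss(12.0, 1.5)    # placeholder measurement in Mbps
            for _ in range(LOCATIONS)
            for _ in range(DAYS)
            for _ in HOURS
        ]

        passing = sum(1 for s in observations if s >= MIN_SPEED_MBPS)
        fraction = passing / len(observations)
        print(f"{passing}/{len(observations)} observations >= {MIN_SPEED_MBPS} Mbps "
              f"({fraction:.1%}); compliant: {fraction >= REQUIRED_FRACTION}")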

      6. The Bureaus propose to allow ETCs, including but not limited to price cap carriers, the option of testing compliance with speed requirements through the Measuring Broadband America (MBA) program, similar to what the Wireline Competition Bureau (WCB) adopted for latency obligations. If the Bureaus were to do so, could they apply the same conditions and parameters as adopted for latency testing? Would any changes be needed?

      7. Should the testing options and parameters be the same for rate-of-return carriers and providers awarded support through the Phase II competitive bidding process as for price cap carriers? If not, what should they be and why?

      8. The Bureaus seek to augment the record received in response to the 2011 USF/ICC Transformation Order and FNPRM based on the considerations outlined above. Specifically, parties such as AT&T and Alaska Communications Systems argued that the testing mechanism should not require measuring service at all end-user locations. A testing mechanism for speed similar to that adopted for latency would only require testing at a certain number of locations. Frontier advocated that the Commission provide a choice of measurement test options. A speed-testing mechanism similar to that adopted for latency would provide two options for testing. A number of rural associations stated that the Commission should not impose measurement requirements until technically feasible, less burdensome testing procedures were available. A speed testing mechanism similar to that adopted for latency should be easily manageable for even very small carriers. The Bureaus seek comment on these tentative conclusions.

    2. Latency Performance Testing for Rate-of-Return Carriers and Providers Awarded Connect America Support Through Competitive Bidding

      1. The Bureaus seek comment on whether the two methods adopted to test price cap carrier compliance with latency service obligations should also be used to test compliance with latency service obligations for other recipients of high-cost support with a broadband public interest obligation to serve fixed locations. If so, should the testing parameters be the same for rate-of-return providers and those that are awarded Phase II support through a competitive bidding process as adopted for price cap carriers? If not, what should those parameters be and why?

      2. The latency-testing options adopted for price cap carriers should provide at least one readily achievable method suitable for small, rural carriers. The Bureaus seek comment on this tentative conclusion. In response to the 2011 USF/ICC Transformation FNPRM, rural carriers argued that broadband performance should only be measured for those portions of the network controlled by the provider or its commonly-controlled affiliates. The Bureaus note that in the Phase II Price Cap Order, 78 FR 70881, November 27, 2013, WCB rejected this argument for price cap carriers because (1) testing only part of the network will not demonstrate the quality of service being provided to the end user and (2) carriers have a number of options to influence the quality of service from their transit and/or peering providers. Would that same reasoning be applicable to other providers, such as rate-of-return carriers and non-traditional providers that may receive support through a competitive bidding process?

    3. Use of MBA Program for Testing and Reporting

      1. The MBA program developed out of a recommendation by the National Broadband Plan to improve the availability of information for consumers about their broadband service. The program examines service offerings from the largest broadband providers--which collectively account for over 80 percent of all U.S. wireline broadband connections--using automated, direct measurements of broadband performance delivered to the homes of thousands of volunteer broadband subscribers. The methodology for the program focuses on measuring broadband performance of an Internet service provider's network, specifically performance from the consumer Internet access point, or consumer gateway, to a close major Internet gateway point. A collaborative process involving Commission staff, industry representatives, and academics was used to determine the test suite and operations for the MBA program.

      2. The MBA program uses whiteboxes deployed to individual consumers, called panelists, to collect data on service levels. These whiteboxes perform periodic tests to determine the speed and latency of the service at a particular panelist's location, and the results of the tests are automatically sent to and recorded by an independent vendor. Panelists are selected via a process that allows for consumer registration and verification by the service provider followed by activation as a testing panelist. More than 13,000 whiteboxes have been shipped since the MBA program began.
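        To make the description above concrete, here is a minimal Python sketch of how a whitebox-style client might periodically run a test and report the result to an independent collection server. The collector URL, the test target, the one-hour interval, and the JSON reporting format are assumptions for the example; they do not describe the MBA program's actual implementation.

        # Illustrative sketch only: a periodic test-and-report loop for a
        # hypothetical whitebox-style client. Endpoints are placeholders.
        import json
        import time
        import socket
        import urllib.request

        COLLECTOR_URL = "https://collector.example.net/results"   # hypothetical vendor endpoint
        TARGET = ("example.com", 443)                              # hypothetical test target
        INTERVAL_SECONDS = 3600                                    # assumed hourly schedule

        def tcp_round_trip_ms(host_port, timeout=5.0):
            """Approximate round-trip latency as the time to open a TCP connection."""
            start = time.monotonic()
            with socket.create_connection(host_port, timeout=timeout):
                pass
            return (time.monotonic() - start) * 1000.0

        while True:
            result = {
                "timestamp": time.time(),
                "latency_ms": tcp_round_trip_ms(TARGET),
            }
            req = urllib.request.Request(
                COLLECTOR_URL,
                data=json.dumps(result).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req)        # report to the independent collection server
            time.sleep(INTERVAL_SECONDS)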

      3. Currently, the MBA program tests wireline offerings of 15 large broadband providers and one satellite-based provider. If the Bureaus were to adopt a regime in which ETCs subject to broadband public interest obligations could demonstrate compliance with broadband testing requirements through their MBA results, would that encourage additional providers, including smaller providers, to seek to join the MBA? Could the MBA accommodate a large number of additional participants? Is it feasible for smaller providers to participate in the MBA, particularly if they must pay the administrative and hardware costs of the whiteboxes? Are these costs likely to be greater or less than the cost of performing ping-type tests from 50 locations for latency and the testing that will be required to verify speed? Would allowing additional providers to join the MBA provide more detailed and more accurate information on provider performance at lower total cost?

      4. If additional providers join the MBA program for performance testing, should their data be made public and reported in the annual MBA reports as is done for other MBA providers? Should the MBA program consider creating a separate category of membership for providers that want to limit testing to Connect America-supported areas?

      5. The Bureaus seek comment on these and any other issues surrounding additional provider participation in the MBA program.

    4. Commission-Developed Testing Mechanism

      1. In the event that joining the MBA program proves infeasible for additional providers, the Bureaus seek comment on whether the Commission should implement a performance testing platform specifically for Connect America-supported broadband services. One possibility is to implement an oversight mechanism that would be similar to the MBA program. Like the MBA program, this could be a hardware-based test infrastructure administered by one or more service vendors with whiteboxes deployed to consumers throughout Connect America-supported areas. Having a single entity, such as USAC, procure the necessary vendor and infrastructure to administer this program would minimize the overall cost of the program as well as the costs to participating providers. The Bureaus seek comment on whether such a program would be feasible. If so, should it be similar to the MBA program, or is there a better way to measure broadband performance?

      2. If the Commission were to implement such a testing mechanism, should all ETCs subject to broadband public interest obligations to serve fixed locations be required to participate? To the extent commenters argue that any ETCs should be exempt, they should identify with specificity the costs and benefits of requiring them to participate, and identify alternative means of achieving the Commission's oversight objectives.

      3. The Bureaus estimate that the total costs for an MBA-type performance oversight program for ETCs receiving high-cost support to serve fixed locations would be approximately $4.2 million in the first year, which would include the necessary hardware and software as well as an initial allocation of 5,000 whiteboxes, and approximately $3.9 million each year thereafter (which incorporates an additional 5,000 whiteboxes per year). Our total cost calculation was based on the following estimates:

        ------------------------------------------------------------------------
                                               Year 1 expenses   Annual expenses
                                               (millions)        after year 1
                                                                  (millions)
        ------------------------------------------------------------------------
        Whiteboxes (client testing devices)    $1.2              $1
        Core Servers                           1.7               1.65
        Program Administrative Expenses
          (could be performed by USAC)         1.3               1.3
        ------------------------------------------------------------------------
        Total Cost                             4.2               3.9
        ------------------------------------------------------------------------

        The cost estimates above are based on having a single entity contract for the necessary hardware and services to minimize costs through streamlined administration and bulk hardware purchases. If the Commission were to implement such a centralized testing program, should these costs be borne by participating providers or by USAC as part of its oversight over the universal service fund? Should USAC pay the costs of the core servers, with participating providers paying the costs of the whiteboxes deployed in their service areas? If USAC were to pay all of the equipment costs, including the whiteboxes, the Bureaus anticipate that the costs to providers would consist primarily of verifying the services of the panelists selected in a particular provider's service territory.

      4. If the Commission were to adopt such an approach, how many whiteboxes should be deployed in each supported area? Should the number be the same for all providers, vary based on the number of customers in the supported area, or be based on some other calculation? Should individual consumers or consumer groups located in areas served by a Connect America-supported provider be allowed to participate in such an MBA-type mechanism by purchasing their own whiteboxes? Such ``citizen testing'' would allow interested individuals to evaluate the quality of their services while providing additional testing data.
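        For illustration only, the short Python sketch below compares two of the allocation alternatives raised above: a fixed number of whiteboxes for every supported area versus a count that scales with the number of customers, subject to a floor. The customer counts, the fixed count of 50, the one-per-500-customers ratio, and the floor of 25 are assumptions for the example, not proposals in this Notice.

        # Illustrative sketch only: comparing two whitebox-allocation rules.
        customers_by_area = {"Provider A": 2_000, "Provider B": 45_000, "Provider C": 180_000}

        FIXED_COUNT = 50            # same number of whiteboxes for every supported area
        RATIO = 1 / 500             # one whitebox per 500 customers (assumed)
        FLOOR = 25                  # minimum per area so small areas remain testable

        for provider, customers in customers_by_area.items():
            proportional = max(FLOOR, round(customers * RATIO))
            print(f"{provider}: fixed={FIXED_COUNT}, proportional-with-floor={proportional}")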

      5. The Bureaus seek comment on the initial performance measurement test suite that should be used if the Commission were to implement an MBA-type testing mechanism. The MBA's current test suite includes 13 tests that measure various aspects of network performance with respect to speed and latency and was developed on a consensus basis by academics, regulators, and industry participants. Would the MBA's test suite be appropriate for a Connect America testing mechanism, or could it be modified in some fashion? What aspects of the MBA test suite are necessary to meet the Commission's objective that ETCs meet their broadband public interest obligations?

      6. The MBA program has found that allowing consumers with whiteboxes (referred to as panelists) access to their testing data is an incentive for obtaining a high number of volunteers. Should a Commission-designed testing mechanism for high-cost recipients allow end-user participants access to their own testing data? MBA results are currently made publicly available via the Commission's Web site. Should the Commission publish test results? Making such data public would allow consumers and policy makers to evaluate whether ETCs are meeting their service obligations and would allow comparisons of service quality among providers. Is there any reason that such performance results should be kept confidential? If so, should the results be treated as confidential for a particular period of time?


  3. Auditing Speed and Latency

    1. In the USF/ICC Transformation Order, the Commission concluded that the results of speed and latency metric testing ``will be subject to audit.'' The Bureaus seek to further develop the record on procedures for implementing this requirement for all recipients of Connect America funding. In particular, the Bureaus seek comment on how to incorporate this requirement into the existing Beneficiary Compliance Audit Program (BCAP), and whether additional audits specifically focused on broadband performance should be implemented outside of BCAP.

    2. High-cost recipients today are subject to random and for-cause USAC audits. The Bureaus seek comment on the circumstances that would warrant examining broadband performance for cause. In particular, what events should trigger a for-cause audit of speed and latency metrics? For example, failure to file a certification that service obligations are being met or a certification that standards are not being met would likely require an immediate audit. Similarly, because MBA results are publicly available, should MBA test results that demonstrate a failure to meet service obligations trigger an audit? Should consumer or other credible complaints regarding the quality of service result in an audit? If customer complaints are used to initiate an audit, the Bureaus seek comment on how this should be done. Should complaints to state/local regulatory agencies, the Commission, and/or public watchdog organizations trigger audits? If so, how many complaints over what time period and what type of complaints should be triggering events for a performance audit? Should requests from local, state, or tribal authorities be sufficient to trigger an audit? Are there other events that should trigger an audit? Proposed audit triggers should address both ensuring that performance standards are met and minimizing administrative costs.

    3. In addition, the Bureaus seek comment on whether a provider whose audit demonstrates a need for ongoing monitoring should be required to pay the costs of this additional monitoring. Should results of audits be made publicly available? If not, what justifications support keeping such results private, and for how long?

  4. Procedural Matters

    1. Initial Regulatory Flexibility Act Analysis. The USF/ICC Transformation Order included an Initial Regulatory Flexibility Analysis (IRFA) pursuant to 5 U.S.C. 603, exploring the potential impact on small entities of the Commission's proposal. The Bureaus invite parties to file comments on the IRFA in light of this additional notice.

    2. Initial Paperwork Reduction Act of 1995 Analysis. This document seeks comment on a potential new or revised information collection requirement. If the Commission adopts any new or revised information collection requirement, the Commission will publish a separate notice in the Federal Register inviting the public to comment on the requirement, as required by the Paperwork Reduction Act of 1995, Public Law 104-13 (44 U.S.C. 3501-3520). In addition, pursuant to the Small Business Paperwork Relief Act of 2002, Public Law 107-198, the Commission seeks specific comment on how it might ``further reduce the information collection burden for small business concerns with fewer than 25 employees.''

    3. Filing Requirements. Pursuant to Sec. Sec. 1.415 and 1.419 of the Commission's rules, interested parties may file comments on or before the dates indicated on the first page of this document. Comments may be filed using the Commission's Electronic Comment Filing System (ECFS).

      Electronic Filers: Comments may be filed electronically using the Internet by accessing the ECFS: http://fjallfoss.fcc.gov/ecfs2/.

      Paper Filers: Parties who choose to file by paper must file an original and one copy of each filing. If more than one docket or rulemaking number appears in the caption of this proceeding, filers must submit two additional copies for each additional docket or rulemaking number.

    4. Filings can be sent by hand or messenger delivery, by commercial overnight courier, or by first-class or overnight U.S. Postal Service mail. All filings must be addressed to the Commission's Secretary, Office of the Secretary, Federal Communications Commission.

      ■ All hand-delivered or messenger-delivered paper filings for the Commission's Secretary must be delivered to FCC Headquarters at 445 12th Street SW., Room TW-A325, Washington, DC 20554. The filing hours are 8:00 a.m. to 7:00 p.m. All hand deliveries must be held together with rubber bands or fasteners. Any envelopes and boxes must be disposed of before entering the building.

      ■ Commercial overnight mail (other than U.S. Postal Service Express Mail and Priority Mail) must be sent to 9300 East Hampton Drive, Capitol Heights, MD 20743.

      ■ U.S. Postal Service first-class, Express, and Priority mail must be addressed to 445 12th Street SW., Washington, DC 20554.

    5. People with Disabilities: To request materials in accessible formats for people with disabilities (Braille, large print, electronic files, audio format), send an email to fcc504@fcc.gov or call the Consumer & Governmental Affairs Bureau at (202) 418-0530 (voice), (202) 418-0432 (tty).

    6. In addition, one copy of each pleading must be sent to each of the following:

      (1) Alexander Minard, Telecommunications Access Policy Division, Wireline Competition Bureau, 445 12th Street SW., 5-B442, Washington, DC 20554; email: alexander.minard@fcc.gov.

      (2) Suzanne Yelen, Industry Analysis and Technology Division, Wireline Competition Bureau, 445 12th Street SW., Room 6-B115, Washington, DC 20554; email: suzanne.yelen@fcc.gov.


    7. The proceeding shall be treated as a ``permit-but-disclose'' proceeding in accordance with the Commission's ex parte rules. Persons making ex parte presentations must file a copy of any written presentation or a memorandum summarizing any oral presentation within two business days after the presentation (unless a different deadline applicable to the Sunshine period applies). Persons making oral ex parte presentations are reminded that memoranda summarizing the presentation must (1) list all persons attending or otherwise participating in the meeting at which the ex parte presentation was made, and (2) summarize all data presented and arguments made during the presentation. If the presentation consisted in whole or in part of the presentation of data or arguments already reflected in the presenter's written comments, memoranda or other filings in the proceeding, the presenter may provide citations to such data or arguments in his or her prior comments, memoranda, or other filings (specifying the relevant page and/or paragraph numbers where such data or arguments can be found) in lieu of summarizing them in the memorandum. Documents shown or given to Commission staff during ex parte meetings are deemed to be written ex parte presentations and must be filed consistent with rule Sec. 1.1206(b). In proceedings governed by rule Sec. 1.49(f) or for which the Commission has made available a method of electronic filing, written ex parte presentations and memoranda summarizing oral ex parte presentations, and all attachments thereto, must be filed through the electronic comment filing system available for that proceeding, and must be filed in their native format (e.g., .doc, .xml, .ppt, searchable .pdf). Participants in this proceeding should familiarize themselves with the Commission's ex parte rules.

      Federal Communications Commission.

      Ryan B. Palmer,

      Chief, Telecommunications Access Policy Division, Wireline Competition Bureau.

      FR Doc. 2014-27429 Filed 11-19-14; 8:45 am

      BILLING CODE 6712-01-P
