Testing an operational support system (OSS) of an incumbent provider for compliance with a regulatory scheme
10 Assignments
0 Petitions
Abstract
A method for testing one or more OSSs of an ILEC includes performing one or more actions associated with preparation of a test plan for testing one or more elements of the ILEC OSSs, performing one or more actions associated with implementation of a test plan in testing one or more of the ILEC OSS elements, evaluating the performance of the one or more ILEC OSS elements according to the testing, and generating one or more test results according to the evaluation, each test result corresponding to a particular ILEC OSS element tested.
95 Citations
35 Claims
1. A method for testing one or more operational support systems (OSSs) of an incumbent provider for compliance with a regulatory scheme, the method performed by an independent testing entity attempting to emulate a competitive provider that would access the OSSs in attempting to compete with the incumbent provider in an open market, the method comprising for each test of an OSS:
performing a test entrance review, according to predetermined review guidelines and prior to initiation of active testing of the OSS for the test, to ensure that all required entrance criteria for the test have been satisfied;
conducting active testing of the OSS for the test according to a written detailed test plan for the test;
evaluating performance of the incumbent provider during active testing according to predetermined evaluation criteria for the test;
generating a written exception for each aspect of the test for which the testing entity determines during active testing that the incumbent provider fails to satisfy one or more applicable predetermined evaluation criteria, the exception describing such failure and the potential impact of the failure on competitive providers that would access the OSS in attempting to compete with the incumbent provider in an open market;
for each exception:
recording the exception in a master list of exceptions maintained in a computer-implemented centralized repository established for comprehensive cataloging, tracking, and reporting of exceptions across multiple tests, test domains, and jurisdictions;
submitting the exception to the incumbent provider for review;
receiving a written response to the exception from the incumbent provider, the response describing one or more planned corrective activities of the incumbent provider to remediate the associated failure;
subsequent to the corrective activities being performed, conducting additional active testing of the OSS according to the detailed test plan with respect to the corresponding aspect of the test;
evaluating performance of the incumbent provider during the additional active testing according to the evaluation criteria applicable to the corresponding aspect of the test;
if the exception is cleared, based on the incumbent provider satisfying the applicable evaluation criteria during the additional active testing, generating a written closure statement for the exception; and
if the exception is not cleared, based on the incumbent provider again failing to satisfy the applicable evaluation criteria during the additional active testing, repeating the submitting, receiving, conducting, and evaluating steps until the exception is cleared or a predetermined time period for the test has elapsed;
generating test results for the test;
performing a test exit review, according to the predetermined review guidelines and subsequent to completion of active testing, to ensure that active testing was conducted in accordance with the detailed test plan, that the test results are appropriately supported, and that all required exit criteria for the test have been satisfied; and
issuing a final report for the test providing a sufficient basis for a regulatory entity administering the regulatory scheme to determine the compliance of the incumbent provider with the regulatory scheme. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
2. The method of claim 1, wherein:
the incumbent provider comprises an incumbent carrier that provides, in competition with one or more competitive carriers, telecommunications services to subscribers; and
the testing is performed to allow the regulatory entity to consider whether and to what extent the incumbent provider provides competitive providers non-discriminatory access to, use of, and support in connection with the one or more OSSs relative to that available to the incumbent provider in its own operations.
3. The method of claim 2, wherein:
the incumbent provider comprises an incumbent local exchange carrier (ILEC);
the competitive providers comprise competitive local exchange carriers (CLECs);
the regulatory entity is associated with a State within the United States of America; and
the regulatory scheme comprises Section 271 of The Telecommunications Act of 1996, codified in the laws of the United States of America at 47 U.S.C. §§ 151 et seq.
4. The method of claim 1, wherein the test entrance review comprises:
performing peer review in connection with the required entrance criteria, including peer review within the testing entity of the detailed test plan;
if significant unresolved issues exist according to the peer review of the required entrance criteria, submitting the issues to a management process within the testing entity for resolution;
if no significant unresolved issues exist but there has been a change in scope of the test relative to the detailed test plan, submitting the scope change to a management process within the testing entity for approval; and
for any test activity to be performed during active testing, waiting until a written sign-off has been obtained for the test activity and has been associated with a project file for the test before allowing the test activity to proceed, such that:
if all required entrance criteria are satisfied and all test activities to be performed during active testing can proceed, a written sign-off for all the test activities is obtained and associated with the project file before any of the test activities are allowed to proceed; and
if all required entrance criteria are not satisfied but one or more selected test activities to be performed during active testing can proceed, (a) a written sign-off for the selected test activities is obtained and associated with the project file before any of the selected test activities are allowed to proceed, and (b) all required entrance criteria are satisfied before remaining test activities to be performed during active testing are allowed to proceed.
5. The method of claim 1, wherein the required entrance review criteria comprise the following global entrance criteria, applicable to all OSS tests, in addition to any test-specific entrance criteria applicable to the test:
the detailed test plan for the test has been approved and is documented in writing prior to initiation of active testing;
any pending legal or regulatory proceedings that may impact the ability to perform the test have been concluded, prior to initiation of active testing, in a manner that allows active testing to proceed unhindered;
any measurements to be used during active testing have been approved as sufficiently supporting collection of test results for the test and are documented in writing prior to initiation of active testing;
the evaluation criteria have been approved and are documented in writing prior to initiation of active testing; and
all required components to be used in active testing are documented in writing and have been determined, prior to initiation of active testing, to be operationally ready for active testing to proceed.
6. The method of claim 1, wherein active testing comprises soliciting information from the incumbent provider using one or more written information requests according to a formal information request process, each information request being generated according to a standard information request form that is available to all personnel of the testing entity and promotes standardization of information requests across all OSS tests, each information request being traceable from submission of the information request to the incumbent provider through return of information from the incumbent provider sufficient to satisfy the information request.
7. The method of claim 6, wherein the formal information request process comprises:
preparing an information request based on the standard information request form;
submitting the information request to the incumbent provider and associating the submitted information request with a project file for the test;
receiving information from the incumbent provider responsive to the information request and associating the received information with the project file;
determining a status of the information request; if all requested information is not received from the incumbent provider within a specified timeframe, repeating the following until all the requested information is received:
submitting a supplemental information request to the incumbent provider and associating the submitted supplemental information request with the project file, receiving information from the incumbent provider responsive to the supplemental information request and associating the received information with the project file, and determining a status of the supplemental information request; and if all requested information is received from the incumbent provider within a specified timeframe and additional information is required, repeating the following until all the additional information is received:
submitting an additional information request to the incumbent provider and associating the submitted additional information request with the project file, receiving information from the incumbent provider responsive to the additional information request and associating the received information with the project file, and determining a status of the additional information request.
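The formal information request process of claims 6 and 7 is, in effect, a tracked request-response loop in which every request and reply is associated with the project file. As an illustrative sketch only (not part of the claims; all identifiers and the completeness flags are hypothetical), the supplemental-request loop might be modeled as:

```python
from dataclasses import dataclass, field

@dataclass
class InformationRequest:
    """One traceable request, from submission through satisfaction (hypothetical model)."""
    request_id: str
    kind: str = "initial"          # "initial", "supplemental", or "additional"
    satisfied: bool = False

@dataclass
class ProjectFile:
    """Project file with which each submitted request is associated."""
    artifacts: list = field(default_factory=list)

    def associate(self, item):
        self.artifacts.append(item)

def run_information_request_process(project_file, responses):
    """Drive the claim-7 loop: submit the initial request, then supplemental
    requests until all requested information is received.  `responses` is a
    hypothetical sequence of booleans indicating whether each reply was complete."""
    replies = iter(responses)
    req = InformationRequest("IR-001")
    project_file.associate(req)
    n = 1
    req.satisfied = next(replies, False)
    while not req.satisfied:
        n += 1
        req = InformationRequest(f"IR-001.{n}", kind="supplemental")
        project_file.associate(req)
        req.satisfied = next(replies, True)  # assume the process eventually completes
    return req
```

The same skeleton would serve for the "additional information" branch of claim 7, with `kind="additional"`.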
8. The method of claim 1, wherein active testing comprises conducting one or more structured interviews with one or more interviewees of the incumbent provider within a formal interview process according to written interview guidelines for the test, the interview guidelines for the test being generated according to generic interview guidelines providing generic instructions for conducting interviews across all OSS tests, the interview guidelines for the test being associated with a project file for the test, the interview guidelines for the test being unavailable to the incumbent provider or the interviewees, the interview process for an interview comprising:
generating a written interview request for the interview, submitting the interview request to the incumbent provider, and associating the submitted interview request with the project file;
confirming the date, time, and location for the interview;
conducting the interview according to the interview guidelines for the test;
completing a detailed interview report comprising notes of the interview and any written materials received during the interview, the interview report being unavailable to the incumbent provider or the interviewee, and associating the interview report with the project file;
generating a brief interview summary comprising key information received during the interview but omitting any analysis of or conclusions based on the key information, submitting the interview summary to the incumbent provider for review and confirmation of the key information, and associating the submitted interview summary with the project file;
if no comments regarding the interview summary are received from the incumbent provider in a suitable format within a specified time period, indicating that no comments were received, finalizing the interview summary and interview report, and associating the finalized interview summary and interview report with the project file; and
if written comments regarding the interview summary are received from the incumbent provider in a suitable format within the specified time period, associating the comments with the project file, finalizing the interview summary and interview report in view of the comments, and associating the finalized interview summary and interview report with the project file.
9. The method of claim 1, wherein the final report is prepared according to a formal final report preparation process comprising:
subsequent to active testing for the test, a test lead of the testing entity issuing a draft report, the draft report being forwarded to a domain lead of the testing entity for review, the domain lead responsible for testing of the OSSs within a test domain comprising the test;
the domain lead reviewing, revising if appropriate, and approving the draft report, any revisions made by the domain lead being clearly indicated, the draft report being forwarded to a peer reviewer within the testing entity for review;
the peer reviewer reviewing, revising if appropriate, and approving the draft report, any revisions made by the peer reviewer being clearly indicated, the draft report being forwarded to a final report team within the testing entity for performance of a formal final report preparation sub-process comprising:
the final report team recording the status of the draft report, the draft report being returned to the test lead for review;
the test lead reviewing, revising if appropriate, and approving the draft report, the draft report being returned to the final report team; and
the final report team reviewing the draft report, incorporating any indicated revisions into the draft report to generate a clean version of the draft report in which any previous revisions are not indicated, and recording the status of the draft report, the draft report being forwarded to a next reviewer;
as a first next reviewer, a first-level supervisor of the domain lead within the testing entity reviewing, revising if appropriate, and approving the draft report, any revisions made by the first-level supervisor being clearly indicated, the draft report being returned to the final report team for performance of the formal final report preparation sub-process;
as a second next reviewer, a professional practice team within the testing entity reviewing, revising if appropriate, and approving the draft report, any revisions made by the professional practice team being clearly indicated, the draft report being returned to the final report team for performance of the formal final report preparation sub-process;
as a third next reviewer, a second-level supervisor of the first-level supervisor within the testing entity reviewing, revising if appropriate, and approving the draft report, any revisions made by the second-level supervisor being clearly indicated, the draft report being returned to the final report team for performance of the formal final report preparation sub-process, in which the draft report being forwarded to the next reviewer comprises the draft report being forwarded to the incumbent provider for review as to factual accuracy; and
the final report team receiving any comments from the incumbent provider based on its review of the draft report and preparing the final report in light of any comments received from the incumbent provider.
10. The method of claim 1, wherein:
the exception submitted to the incumbent provider is considered a draft exception;
a response to the draft exception challenging the draft exception on one or more factual bases is received from the incumbent provider;
a determination is made that the draft exception should not be withdrawn;
the draft exception is considered an open exception in response to the determination that the draft exception should not be withdrawn; and
the exception for which the response describing the corrective activities is received from the incumbent provider comprises the open exception.
11. The method of claim 10, wherein the determination that the exception should not be withdrawn is made in cooperation with the regulatory entity.
12. The method of claim 1, wherein the performance of the incumbent provider is evaluated as to each evaluation criterion individually and each evaluation criterion has its own associated test result, the test result for each criterion comprising one of the following:
satisfied, meaning that the evaluation criterion was satisfied;
satisfied with qualifications, meaning that the evaluation criterion was satisfied but one or more specific areas need improvement;
not satisfied, meaning that the evaluation criterion was not satisfied in that one or more issues were identified that would have a business impact on competitive providers attempting to compete with the incumbent provider in an open market, whether or not an exception was generated;
satisfied with exception resolved, meaning that the evaluation criterion was not initially satisfied, an exception was generated, the incumbent provider performed corrective activities, and the evaluation criterion was ultimately satisfied; and
satisfied with qualifications with exception addressed, meaning that the evaluation criterion was not initially satisfied, an exception was generated, the incumbent provider performed corrective activities, and the evaluation criterion was ultimately satisfied with qualifications.
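The five per-criterion outcomes of claim 12 form a small closed vocabulary. The following is a hypothetical sketch (not prescribed by the claims) of how a testing entity might encode them, together with one simplified reading of how an evaluation history maps onto an outcome:

```python
from enum import Enum

class TestResult(Enum):
    """The five per-criterion outcomes enumerated in claim 12 (illustrative labels)."""
    SATISFIED = "satisfied"
    SATISFIED_WITH_QUALIFICATIONS = "satisfied with qualifications"
    NOT_SATISFIED = "not satisfied"
    SATISFIED_WITH_EXCEPTION_RESOLVED = "satisfied with exception resolved"
    SATISFIED_WITH_QUALIFICATIONS_WITH_EXCEPTION_ADDRESSED = (
        "satisfied with qualifications with exception addressed")

def final_result(initially_satisfied, exception_generated, retest_outcome=None):
    """Map an evaluation history onto a TestResult.  A simplified reading of
    claim 12: `retest_outcome` is None, "satisfied", or "qualified" after
    corrective activities; the plain "satisfied with qualifications" path
    (no exception) is omitted for brevity."""
    if initially_satisfied:
        return TestResult.SATISFIED
    if not exception_generated or retest_outcome is None:
        return TestResult.NOT_SATISFIED
    if retest_outcome == "satisfied":
        return TestResult.SATISFIED_WITH_EXCEPTION_RESOLVED
    return TestResult.SATISFIED_WITH_QUALIFICATIONS_WITH_EXCEPTION_ADDRESSED
```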
13. A method for testing one or more operational support systems (OSSs) of an incumbent provider for compliance with a regulatory scheme, the method performed by an independent testing entity attempting to emulate a competitive provider that would access the OSSs in attempting to compete with the incumbent provider in an open market, the method comprising for each test of an OSS:
conducting active testing of the OSS for the test according to a written detailed test plan for the test;
evaluating performance of the incumbent provider during active testing according to predetermined evaluation criteria for the test;
generating a written exception for each aspect of the test for which the testing entity determines during active testing that the incumbent provider fails to satisfy one or more applicable evaluation criteria, the exception describing such failure and the potential impact of the failure on competitive providers that would access the OSS in attempting to compete with the incumbent provider in an open market;
for each exception, conducting an exception resolution process comprising:
recording the exception in a master list of exceptions maintained in a computer-implemented centralized repository established for comprehensive cataloging, tracking, and reporting of exceptions across multiple tests, test domains, and jurisdictions, the master list of exceptions comprising an exception identifier for each exception and a status of each exception, the status for the exception being updated as appropriate during the exception resolution process;
submitting the exception to the incumbent provider for review, the exception submitted to the incumbent provider being considered a draft exception, the exception having a draft status in the master list of exceptions;
receiving a written response to the draft exception from the incumbent provider challenging the draft exception on one or more factual bases;
determining in cooperation with a regulatory entity administering the regulatory scheme that the draft exception should not be withdrawn, the draft exception being considered an open exception in response to the determination that the draft exception should not be withdrawn, the exception having an open status in the master list of exceptions;
receiving a written response to the open exception from the incumbent provider describing one or more planned corrective activities of the incumbent provider to remediate the associated failure;
subsequent to the corrective activities being performed, conducting additional active testing of the OSS according to the detailed test plan with respect to the corresponding aspect of the test;
evaluating performance of the incumbent provider during the additional active testing according to applicable evaluation criteria;
if the open exception is cleared, based on the testing entity determining in cooperation with the regulatory entity that the incumbent provider has satisfied applicable evaluation criteria during the additional active testing, generating a written closure statement for the open exception, the open exception being considered a closed exception in response to generation of the closure statement, the exception having a closed status in the master list of exceptions; and
if the open exception is not cleared, based on the testing entity determining in cooperation with the regulatory entity that the incumbent provider has again failed to satisfy applicable evaluation criteria during the additional active testing, repeating the submitting, receiving, conducting, and evaluating steps until either the open exception is cleared or a predetermined time period for the test has elapsed;
generating test results for the test; and
issuing a final report for the test providing a sufficient basis for the regulatory entity to determine the compliance of the incumbent provider with the regulatory scheme. - View Dependent Claims (14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25)
14. The method of claim 13, wherein the exception resolution process further comprises:
establishing an exception tracking team of the testing entity responsible for recording the exception in the master list, monitoring the status of the exception, and facilitating the exception resolution process;
a test team member of the testing entity identifying a potential exception during active testing;
a test lead of the testing entity reviewing the potential exception and determining whether the potential exception should proceed, the exception being generated only if the test lead determines that the potential exception should proceed;
one or more supervisors of the test lead within the testing entity reviewing the draft exception and determining whether the draft exception should proceed, the draft exception being submitted to the incumbent provider only if the one or more supervisors determine that the draft exception should proceed;
one or more of the test lead, supervisors, and exception tracking team reviewing the response to the draft exception received from the incumbent provider and determining whether the draft exception should proceed, the draft exception being provided to the regulatory entity for review only if the one or more of the test lead, supervisors, and exception tracking team determine that the draft exception should proceed; and
the test lead reviewing the response to the open exception received from the incumbent provider and determining whether the response addresses all appropriate aspects of the open exception, whether the response provides information supporting the described corrective activities as sufficient to remediate the open exception, and whether the corrective activities are verifiable through additional active testing, additional active testing being conducted only if these determinations are positive; if any of these determinations are not positive:
the test lead notifying the exception tracking team;
one or more of the test lead, supervisors, and exception tracking team reviewing one or more issues relevant to additional active testing;
the exception tracking team notifying the incumbent provider of one or more issues relevant to additional active testing; and
receiving another written response to the open exception from the incumbent provider.
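Claims 13 and 14 describe an exception lifecycle tracked in the master list: each exception is recorded as a draft, is either withdrawn or opened once the incumbent provider's factual challenge is resolved in cooperation with the regulatory entity, and is closed once additional active testing clears it. A minimal sketch of that state machine, with hypothetical names (the claims do not prescribe any implementation):

```python
from enum import Enum

class ExceptionStatus(Enum):
    DRAFT = "draft"
    OPEN = "open"
    CLOSED = "closed"
    WITHDRAWN = "withdrawn"

class MasterList:
    """Centralized master list of exceptions keyed by exception identifier
    (a minimal sketch of the claim-13 repository)."""
    def __init__(self):
        self._records = {}

    def record(self, exception_id):
        self._records[exception_id] = ExceptionStatus.DRAFT

    def update(self, exception_id, status):
        self._records[exception_id] = status

    def status(self, exception_id):
        return self._records[exception_id]

def resolve(master, exception_id, withdraw, cleared_on_retest):
    """Walk one exception through the claim-13 lifecycle:
    draft -> (withdrawn | open), then open -> closed once additional
    active testing satisfies the applicable evaluation criteria."""
    master.record(exception_id)                        # recorded as a draft exception
    if withdraw:                                       # challenge sustained: withdrawn
        master.update(exception_id, ExceptionStatus.WITHDRAWN)
        return master.status(exception_id)
    master.update(exception_id, ExceptionStatus.OPEN)  # determined not to be withdrawn
    if cleared_on_retest:                              # retest clears the exception
        master.update(exception_id, ExceptionStatus.CLOSED)
    return master.status(exception_id)
```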
15. The method of claim 13, wherein:
the centralized repository operates on one or more computer systems at one or more locations; and
the master list of exceptions is accessible to a plurality of interested parties in the form of Hypertext Markup Language (HTML) pages communicated to the interested parties using a web server associated with the centralized repository.
16. The method of claim 15, wherein the centralized repository provides tracking information for each exception in the master list, the tracking information for an exception comprising the exception identifier for the exception, the status of the exception, and one or more of a brief description of the exception, a classification of the exception, a status reason for the exception, any historical notes for the exception, and any related exceptions.
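Claims 15 and 16 require only that the master list be served as HTML pages carrying per-exception tracking information; they do not prescribe any particular markup. One hypothetical rendering (field names and layout are assumptions, not the claimed design):

```python
from html import escape

def exception_row(exc):
    """Render one master-list entry as an HTML table row."""
    cells = (exc["id"], exc["status"], exc.get("description", ""))
    return "<tr>" + "".join(f"<td>{escape(c)}</td>" for c in cells) + "</tr>"

def master_list_page(exceptions):
    """Assemble the HTML page a web server associated with the
    centralized repository might communicate to interested parties."""
    rows = "\n".join(exception_row(e) for e in exceptions)
    return ("<html><body><table>\n"
            "<tr><th>Exception</th><th>Status</th><th>Description</th></tr>\n"
            f"{rows}\n</table></body></html>")
```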
17. The method of claim 13, wherein the response to the open exception received from the incumbent provider comprises a revision of the response to the draft exception received from the incumbent provider.
18. The method of claim 13, further comprising:
submitting the open exception and the response to the open exception received from the incumbent provider to one or more competitive providers; and
receiving comments from the competitive providers regarding the additional active testing in light of the open exception and the response to the open exception received from the incumbent provider.
19. The method of claim 13, further comprising receiving a revised response to the open exception from the incumbent provider at any time before the open exception is closed.
20. The method of claim 13, further comprising:
if the response to the open exception received from the incumbent provider does not provide a sufficient basis for the additional active testing, notifying the incumbent provider that the exception resolution process cannot proceed; and
proceeding with the exception resolution process only after information is received from the incumbent provider providing a sufficient basis for the additional active testing.
21. The method of claim 13, further comprising generating an observation in response to the testing entity determining, during active testing of the OSS for the test, that a system, process, policy, or practice characteristic of the incumbent provider might result, but in contrast to an exception will not necessarily result, in the incumbent provider failing to satisfy a predetermined evaluation criterion for the test.
22. The method of claim 21, wherein the observation comprises one of:
a first type of observation comprising a question regarding the OSS for the test that cannot be answered without additional guidance from the incumbent provider; and
a second type of observation identifying a potential deficiency, before an exception is generated with respect to the deficiency, such that the incumbent provider may address the potential deficiency in a satisfactory and timely manner to prevent an exception from being generated with respect to the deficiency.
23. The method of claim 22, wherein:
an observation of the first type may be cleared, upgraded to an observation of the second type, or upgraded to an exception depending on a response to the observation from the incumbent provider; and
an observation of the second type may be cleared or upgraded to an exception, but not upgraded to an observation of the first type, depending on a response to the observation from the incumbent provider.
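The upgrade rules of claims 22 and 23 are asymmetric: a first-type observation (an open question) may be cleared, upgraded to a second-type observation (a potential deficiency), or upgraded to an exception, while a second-type observation may only be cleared or upgraded to an exception. Encoded as a transition table (labels are hypothetical, chosen only for illustration):

```python
# Allowed dispositions for an observation, depending on its type, per claims 22-23.
ALLOWED = {
    "first_type": {"cleared", "second_type", "exception"},
    "second_type": {"cleared", "exception"},
}

def can_transition(current, target):
    """Return True if the claimed scheme permits moving an observation
    from `current` to `target` in response to the incumbent provider's response."""
    return target in ALLOWED.get(current, set())
```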
24. The method of claim 21, wherein one or more exceptions in the master list of exceptions are not based on a previously generated observation.
25. The method of claim 21, wherein the observation is generated as a draft observation and has a draft status until a determination is made to upgrade the observation to an open observation having an open status.
26. A system used in connection with testing one or more operational support systems (OSSs) of an incumbent provider by an independent testing entity for compliance with a regulatory scheme, the testing entity attempting to emulate a competitive provider that would access the OSSs in attempting to compete with the incumbent provider in an open market, the system operating on one or more computer systems at one or more locations and comprising:
a centralized repository supporting comprehensive cataloging, tracking, and reporting of exceptions across multiple tests, test domains, and jurisdictions, the centralized repository maintaining a master list of exceptions comprising an exception identifier for each exception and a status of each exception, the status for each exception being updated as appropriate during an exception resolution process, at least one exception:
having been generated for a test in response to the testing entity determining, during active testing of the OSS for the test according to a written detailed test plan for the test, that the incumbent provider failed to satisfy a predetermined evaluation criterion for the test;
describing such failure and the potential impact of the failure on competitive providers that would access the OSS in attempting to compete with the incumbent provider in an open market;
having been recorded in the master list of exceptions;
having been submitted to the incumbent provider for review as a draft exception having a draft status in the master list of exceptions;
having been allowed to proceed as an open exception having an open status in the master list of exceptions in response to the testing entity receiving a written response to the draft exception from the incumbent provider challenging the draft exception on one or more factual bases and the testing entity determining in cooperation with a regulatory entity administering the regulatory scheme that the draft exception should not be withdrawn; and
having been designated as a closed exception having a closed status in the master list of exceptions in response to the testing entity receiving a written response to the open exception from the incumbent provider describing one or more planned corrective activities of the incumbent provider to remediate the associated failure, the testing entity conducting additional active testing of the OSS according to the detailed test plan with respect to the corresponding aspect of the test subsequent to the corrective activities being performed, the testing entity evaluating performance of the incumbent provider during the additional active testing according to the evaluation criterion, the testing entity clearing the exception based on the testing entity determining in cooperation with the regulatory entity that the incumbent provider has satisfied the evaluation criterion during the additional active testing, and the testing entity generating a written closure statement for the exception; and
a web server operable to make the master list of exceptions accessible to a plurality of interested parties in the form of Hypertext Markup Language (HTML) pages communicated to the interested parties from the web server.
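The draft/open/closed exception lifecycle recited in claim 26 can be illustrated with a short sketch of a master-list entry. All names (`ExcStatus`, `ExceptionRecord`, `master_list`, `record_exception`) are hypothetical and chosen only for illustration; the claim does not prescribe any particular data model:

```python
from enum import Enum

class ExcStatus(Enum):
    DRAFT = "draft"    # submitted to the incumbent provider for review
    OPEN = "open"      # allowed to proceed after a factual challenge is rejected
    CLOSED = "closed"  # cleared after successful additional active testing

class ExceptionRecord:
    """One entry in the master list of exceptions."""

    def __init__(self, exc_id: str, description: str):
        self.exc_id = exc_id            # exception identifier
        self.description = description  # failure and its competitive impact
        self.status = ExcStatus.DRAFT

    def allow_to_proceed(self) -> None:
        # Testing entity, in cooperation with the regulatory entity,
        # determines the draft exception should not be withdrawn.
        self.status = ExcStatus.OPEN

    def close(self, retest_passed: bool) -> bool:
        # Closed only if the additional active testing shows the evaluation
        # criterion is now satisfied; otherwise the exception stays open.
        if self.status is ExcStatus.OPEN and retest_passed:
            self.status = ExcStatus.CLOSED
        return self.status is ExcStatus.CLOSED

# The centralized repository's master list, keyed by exception identifier.
master_list: dict[str, ExceptionRecord] = {}

def record_exception(exc: ExceptionRecord) -> None:
    master_list[exc.exc_id] = exc
```

A web front end as recited in the claim would simply render `master_list` as HTML pages for the interested parties; that rendering layer is omitted here.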
at least one exception has remained an open exception in response to the testing entity determining in cooperation with the regulatory entity that the incumbent provider has again failed to satisfy the corresponding evaluation criterion during additional active testing conducted for the exception; and
the steps of submitting the open exception to the incumbent provider, receiving a response to the open exception from the incumbent provider describing planned corrective activities, conducting additional active testing subsequent to the corrective activities being performed, and determining in cooperation with the regulatory entity whether the incumbent provider has satisfied the evaluation criterion during the additional active testing must be repeated until either the open exception is cleared and becomes a closed exception or a predetermined time period for the test has elapsed.
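The repeat-until structure described above (submit, receive a corrective plan, retest, re-evaluate, until the exception clears or the predetermined time period elapses) can be sketched as a simple loop. The function and parameter names are hypothetical, and the time period is modeled as a maximum number of correction-and-retest cycles purely for illustration:

```python
from typing import Callable

def resolve_open_exception(run_retest: Callable[[], bool], max_cycles: int) -> str:
    """Repeat the correction-and-retest cycle until the exception clears
    or the predetermined time period (modeled as max_cycles) elapses.

    run_retest() models one full cycle: submitting the open exception,
    receiving the incumbent's planned corrective activities, conducting
    additional active testing, and reporting True if the evaluation
    criterion is now satisfied.
    """
    for _ in range(max_cycles):
        if run_retest():
            return "closed"  # exception cleared; becomes a closed exception
    return "open"            # time period elapsed; exception remains open
```

For example, an exception whose third retest succeeds within a five-cycle window ends closed, while one that never passes within its window remains open.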
31. The system of claim 26, wherein:
the centralized repository further supports comprehensive cataloging, tracking, and reporting of observations across multiple tests, test domains, and jurisdictions, the centralized repository maintaining for each observation an observation identifier and a current status; and
each observation maintained in the central repository has been generated in response to the testing entity determining, during active testing of an OSS for a test according to a written detailed test plan for the test, that a system, process, policy, or practice characteristic of the incumbent provider might result, but in contrast to an exception will not necessarily result, in the incumbent provider failing to satisfy a predetermined evaluation criterion for the test.
32. The system of claim 31, wherein each observation comprises one of:
a first type of observation comprising a question regarding the OSS for the test that cannot be answered without additional guidance from the incumbent provider; and
a second type of observation identifying a potential deficiency, before an exception is generated with respect to the deficiency, such that the incumbent provider may address the potential deficiency in a satisfactory and timely manner to prevent an exception from being generated with respect to the deficiency.
33. The system of claim 32, wherein:
an observation of the first type may be cleared, upgraded to an observation of the second type, or upgraded to an exception depending on a response to the observation from the incumbent provider; and
an observation of the second type may be cleared or upgraded to an exception, but not upgraded to an observation of the first type, depending on a response to the observation from the incumbent provider.
34. The system of claim 31, wherein one or more exceptions in the master list of exceptions are not based on a previously generated observation.
35. The system of claim 31, wherein each observation is generated as a draft observation and has a draft status until a determination is made to upgrade the observation to an open observation having an open status.
Specification