
Results 1 - 10 of 458,916 for tests process. Search took 5.114 seconds.  
Functional stage : Memory management control operation Purpose : Check that decoder handles memory management control operations. 3.2.2.3 Test Bitstreams – Weighted sample prediction process 3.2.2.3.1 Test bitstream #AVCCVWP-1 Specification : All slices are coded as I or P slices. (...) Functional stage : Weighted sample prediction process for B slice Purpose : Check that decoder handles weighted sample prediction for B slices. 3.2.2.4 Test Bitstreams – Slice of coded field 3.2.2.4.1 Test bitstream #AVCCVFI-1 Specification : All slices are coded as I or P slices. (...) Functional stage : Reconstruction of B slice with CABAC Purpose : Check that decoder reconstructs B slices with CABAC. 3.2.2.7 Test Bitstreams – CABAC: Weighted sample prediction process 3.2.2.7.1 Test bitstream #AVCCAWP-1 Specification : All slices are coded as I or P slices.
Language: English
Score: 833953.8 - https://www.itu.int/wftp3/av-a...te/2003_05_Geneva/JVT-H035.doc
Data Source: un
This subclause will explain how this test can be accomplished when the reconstructed samples at the output of the decoding process are available. (...) Static tests are used for testing the decoding process. (...) Purpose : Check that decoder handles memory management control operations. 6.4.4 Test Bitstreams – Weighted sample prediction process 6.4.4.1 Test bitstream #AVCWP-1 Specification : All slices are coded as I or P slices.
Language: English
Score: 830552.9 - https://www.itu.int/wftp3/av-a...2004_07_Redmond/JVT-L020-L.doc
Data Source: un
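The excerpts above describe "static" conformance tests, in which the reconstructed samples at the output of the decoding process are compared against the output of a reference decoder. As a minimal sketch (the file names and the byte-exact comparison granularity are assumptions for illustration, not taken from the source documents), such a check reduces to comparing two raw sample dumps:

```python
# Minimal sketch of a static conformance check: decode a test bitstream
# with the decoder under test, then compare its reconstructed samples
# against the reference decoder's output. A decoder is conformant for
# the test only if every sample matches exactly.

def samples_match(reconstructed: bytes, reference: bytes) -> bool:
    """Return True if the decoder output matches the reference samples exactly."""
    return reconstructed == reference

# Hypothetical usage: compare raw YUV dumps produced by the two decoders.
# with open("test.yuv", "rb") as a, open("ref.yuv", "rb") as b:
#     assert samples_match(a.read(), b.read())
```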
This subclause will explain how this test can be accomplished when the reconstructed samples at the output of the decoding process are available. (...) Static tests are used for testing the decoding process. (...) Purpose : Check that decoder handles memory management control operations. 6.4.4 Test Bitstreams – Weighted sample prediction process 6.4.4.1 Test bitstream #AVCWP-1 Specification : All slices are coded as I or P slices.
Language: English
Score: 828886.9 - https://www.itu.int/wftp3/av-a...te/2004_03_Munich/JVT-K045.doc
Data Source: un
Many tests are performed on syntax elements in a state prior to their use in some processing stages. 1.2.1 Requirement on output of the decoding process and timing The output of the decoding process is specified by clause 8 of ISO/IEC 14496-10|ITU-T H.264. (...) This subclause will explain how this test can be accomplished when the reconstructed samples at the output of the decoding process are available. (...) Static tests are used for testing the decoding process.
Language: English
Score: 827530.2 - https://www.itu.int/wftp3/av-a.../2003_12_Waikoloa/JVT-J011.doc
Data Source: un
Many tests are performed on syntax elements in a state prior to their use in some processing stages. 1.2.1 Requirement on output of the decoding process and timing The output of the decoding process is specified by clause 8 of ISO/IEC 14496-10|ITU-T H.264. (...) This subclause will explain how this test can be accomplished when the reconstructed samples at the output of the decoding process are available. (...) Static tests are used for testing the decoding process.
Language: English
Score: 827530.2 - https://www.itu.int/wftp3/av-a...003_12_Waikoloa/JVT-J045d1.doc
Data Source: un
Rapid testing When rapid testing is deployed, scalability and efficiency of the overall testing process require carefully balancing the sampling capacity with the result processing capacity. (...) Depending on the capacities available, test results processing time and other operational and regulatory factors, passengers may be allowed to leave the testing facility while waiting for the test results. (...) Whilst such emergency testing capacity should ideally be deployed along the arrival process at the airport (e.g. prior to entry border controls), other deployment models could be considered, especially if the testing requirements rely on PCR tests and associated long processing times (e.g. off-airport clinics, testing capacity at airport hotels).
Language: English
Score: 827510.8 - https://www.icao.int/safety/CA...ning-international-borders.pdf
Data Source: un
Functional stage : Memory management control operation and reference picture list reordering Purpose : Check that decoder handles memory management control operation and reference picture list reordering. 4.2.2.3 Test Bitstreams – Weighted sample prediction process Test bitstream #AVCCVWP-1 Specification : All slices are coded as I or P slices. (...) Functional stage : Reconstruction of I slice with deblocking filter process and CABAC Purpose : Check that decoder reconstructs I slice with CABAC. Test bitstream #AVCCABA-2 Specification : All slices are coded as I or P slices. (...) Functional stage : Reconstruction of B slice with CABAC Purpose : Check that decoder reconstructs B slices with CABAC. 4.2.2.10 Test Bitstreams – CABAC: Weighted sample prediction process Test bitstream #AVCCAWP-1 Specification : All slices are coded as I or P slices.
Language: English
Score: 825918.4 - https://www.itu.int/wftp3/av-a.../2003_09_SanDiego/JVT-I045.doc
Data Source: un
Functional stage : Memory management control operation and reference picture list reordering Purpose : Check that decoder handles memory management control operation and reference picture list reordering. 3.2.2.3 Test Bitstreams – Weighted sample prediction process Test bitstream #AVCCVWP-1 Specification : All slices are coded as I or P slices. (...) Functional stage : Reconstruction of I slice with deblocking filter process and CABAC Purpose : Check that decoder reconstructs I slice with CABAC. Test bitstream #AVCCABA-2 Specification : All slices are coded as I or P slices. (...) Functional stage : Reconstruction of B slice with CABAC Purpose : Check that decoder reconstructs B slices with CABAC. 3.2.2.10 Test Bitstreams – CABAC: Weighted sample prediction process Test bitstream #AVCCAWP-1 Specification : All slices are coded as I or P slices.
Language: English
Score: 825859.5 - https://www.itu.int/wftp3/av-a...003_09_SanDiego/JVT-I045d0.doc
Data Source: un
ICAO SYMPOSIUM ON INNOVATION IN AVIATION SECURITY, Montréal, 21-23 October 2014. ECAC COMMON EVALUATION PROCESS OF SECURITY EQUIPMENT, José María Peral Pecharromán.
Overview: Background; ECAC Technical Specifications; ECAC Common Evaluation Process; Main Achievements (CEP for EDS, LEDS, SSc, ETD); Conclusions.
Background: Testing of security equipment comprises laboratory tests, on-site acceptance tests and routine tests, with certification/approval of equipment by ECAC Member States. This is difficult for Member States having no national test centres or limited resources; testing methods differ across national test centres; it is complicated and costly for manufacturers; and there is uncertainty about the performance of deployed equipment.
ECAC Technical Specifications: The 44 Member States of ECAC decided to establish a common process for evaluating equipment performance, based on a Common Testing Methodology, with tests conducted by designated participating test centres, evaluation against ECAC/EU performance standards, and sharing of test results with all ECAC Member States. In August 2009 the Common Evaluation Process of security equipment (CEP) was launched, with ECAC study groups established for each category of security equipment, national threat lists reviewed, and a threat list common to all 44 ECAC Member States adopted.
ECAC Common Evaluation Process I: The legal basis is the Administrative Arrangements, signed by 44 Member States, covering the organisation of the CEP and the commitments of participants. The Common Testing Methodology is a single document per category of equipment, developed within the ECAC Technical Task Force and approved by ECAC Directors General. Participating Test Centres are designated by their National Authority, with coordination of their CEP activities.
ECAC Common Evaluation Process II: Draft test reports are endorsed by the CEP Management Group, indicate compliance with ECAC/EU performance standards, and are communicated to ECAC Member States (for equipment meeting a standard). Summary information is published on the ECAC website: www.ecac-ceac.org. The CEP Management Group (Contributing Authorities, Test Centres, ECAC Secretariat) allocates equipment to Test Centres, endorses test reports and monitors implementation of the CEP tests.
ECAC Common Evaluation Process III: [process flow diagram: manufacturer submits a test request; the ECAC Secretariat allocates the system to a Participating Test Centre; after the test, the CEP Management Group endorses the test report; if the equipment meets the standard, the ECAC Secretariat publishes the results to ECAC Member States and updates the ECAC lists on the web; if not, the manufacturer receives a debriefing and a closing letter.]
Implementation of CEP: CEP applies to four categories of security equipment: Explosive Detection Systems (EDS), Liquid Explosive Detection Systems (LEDS), Security Scanners (SSc) and Explosive Trace Detection (ETD) systems. Possible extensions to new categories (2015/2016) include Metal Detection Equipment (MDE) for cargo and Advanced Cabin Baggage Systems (ACBS). Designated participating test centres: Germany, France, Netherlands, Spain, Switzerland and United Kingdom.
Main Achievements, CEP for EDS: From December 2009 to date (October 2014), more than 70 EDS configurations have been tested from 6 manufacturers; 44 EDS configurations are listed on the ECAC website (36 at Standard 3, 8 at Standard 2). Standard 1 for EDS expired on 1 September 2012.
Main Achievements, CEP for LEDS: From May 2010 to date (October 2014), more than 180 LEDS configurations have been tested from 24 manufacturers; 82 LEDS configurations are listed on the ECAC website: 15 at Standard 3 (4 type A, 11 type B), 49 at Standard 2 (2 type A, 15 type B, 28 type C, 4 type D) and 18 at Standard 1 (1 type A, 4 type B, 10 type C, 3 type D).
Main Achievements, CEP for SSc: From April 2012 to date (October 2014), 12 SSc configurations have been tested from 3 manufacturers; 11 SSc configurations are listed on the ECAC website (4 at Standard 2, 7 at Standard 1).
Main Achievements, CEP for ETD: Since June 2014 (as of October 2014): the Common Testing Methodology (CTM) for ETD was endorsed on 18 April 2014; 4 ETD systems have been tested and 14 tests are currently ongoing; 1 ETD configuration is listed on the ECAC website.
Conclusions: The ECAC Common Evaluation Process provides a robust and flexible system for standardised laboratory tests of aviation security equipment; its results are accepted throughout ECAC Member States; it is expandable to new equipment types; it is open to additional Contributing Authorities and Test Centres; and it is recognised by non-ECAC States (e.g. Australia, Canada, USA). It is complemented by best practice/guidance material on on-site acceptance tests, guidance material on routine tests, and capacity-building activities.
Thank you. Any questions?
Language: English
Score: 824477 - https://www.icao.int/Meetings/...20Peral%20Pecharroman.ECAC.pdf
Data Source: un
Static tests are used for testing the decoding process. (...) [To be filled] Functional stage: Test the reconstruction process of sample adaptive offset. (...) [To be filled] Functional stage: Test the reconstruction process of sample adaptive offset.
Language: English
Score: 824296.2 - https://www.itu.int/wftp3/av-a...CTVC-Lxxxxx_conformance_d3.doc
Data Source: un