Results 41 - 50 of 192,669 for perform.
This paper presents a case for a regional push towards the development and implementation of key performance indicators (KPIs) within the performance areas addressed by the ongoing APIRG projects, in order to:
- enable the participation of the AFI region in sharing performance issues and benchmarking best practices at the regional and global levels;
- develop business cases for ASBU Module implementation, with investment based on KPIs;
- inform future decisions on the timeliness and appropriateness of ASBU Module deployment according to a performance-driven approach; and
- measure and document the performance benefits brought by the ASBU Modules implemented under APIRG Projects.
Action: The meeting is invited to agree to the recommendations in paragraph 3. (...) The Global Air Navigation Plan recommends a phased development approach for the adoption of key performance indicators by States, noting that States are at different levels with regard to performance measurement as the basis for air navigation system improvements. 2. DISCUSSION 2.1 Motivated by the need for data-driven performance management of air navigation services in Kenya, an ANS operational performance measurement and monitoring (OPMM) framework was developed as part of the Kenya Airspace Master Plan 2015-2030.
Language:English
Score: 465764.9 - https://www.icao.int/ESAF/Docu...ed%20to%20ASBU%20framework.pdf
Data Source: un
Lessons Learned and Best Practices from past Performance Review: Considering this is the first SPRFMO Performance Review, it has been a learning process for the Organisation. (...) Actions needed to further strengthen the effectiveness of the Performance Review Process: The SPRFMO Commission has not yet elaborated on further strengthening the Performance Review process. (...) Implementation of the Recommendations of the Performance Review. Lessons Learned and Best Practices from past Performance Review.
Language:English
Score: 465352.64 - https://www.un.org/Depts/los/c...s/ICSP14/RFBs&RFMOs/SPRFMO.pdf
Data Source: un
The Performance Review panel consisted of three external experts and three internal experts. (...) The Report and recommendations of NAFO’s 2018 Performance Review can be found at this link: https://www.nafo.int/Portals/0/PDFs/Performance/NAFOPerformanceReviewPanelRpt2018.pdf The Report and recommendations of NAFO’s first Performance Assessment Review in 2011 can be found at this link: https://www.nafo.int/Portals/0/PDFs/Performance/PAR-2011.pdf (i) the scope of performance reviews of regional fisheries management organizations and arrangements and the importance and role of such reviews for the implementation of the Agreement; The scope of the 2018 NAFO Performance Review was determined by NAFO Contracting Parties (CPs), which formed a Working Group to develop the Performance Review’s Terms of Reference. (...) To evaluate how NAFO has responded to the outcome of the 2011 NAFO Performance Review (PR 1), taking into consideration the work and practices of NAFO's bodies, subsidiary bodies and working groups to date, and also the implementation of the action plan resulting from the recommendations of the 2011 NAFO Performance Review. https://www.nafo.int/Portals/0/PDFs/com/2017/comdoc17-21.pdf
Language:English
Score: 465337.45 - https://www.un.org/Depts/los/c...nts/ICSP14/RFBs&RFMOs/NAFO.pdf
Data Source: un
Scope of performance reviews of RFMO/As and importance of such reviews for the implementation of the treaty: Our experience has shown there is a wide degree of variation in the scope, rigour and depth of RFMO/A performance reviews. (...) Process and structure of performance review of RFMO/As, including related to independent evaluation, participation, transparency, accountability and periodicity: ISSF has participated in both committee-based performance reviews and independent performance reviews. (...) Accountability remains one of the most important aspects of the performance review process. In subsequent performance reviews it is critical that review panels are tasked with assessing the actions taken in relation to the recommendations of the previous performance review.
Language:English
Score: 465331.74 - https://www.un.org/Depts/los/c...greements/ICSP14/NGOs/ISSF.pdf
Data Source: un
Mapping of IMDRF essential principles to AI for health software - Att.1 - Spreadsheets for cluster creation. Cover: FGAI4H-G-038-A01, New Delhi, 13-15 November 2019. Source: WG DAISAM chairs. Title: Mapping of IMDRF essential principles to AI for health software - Attachment 1 - Spreadsheets for cluster creation. Abstract: This spreadsheet contains four worksheets that document how clusters were created for the main contribution:
- "key concepts": a chronological listing of IMDRF keywords (e.g., Accuracy, Alarms, Clinical performance, Cybersecurity, Intended use, Risk management plan, Validation) against the IMDRF section in which each appears (5.1.1 through 7.2.4);
- "key concepts with clusters": the same keywords, each assigned to up to three clusters (e.g., Analytical performance, Clinical performance, Risk and Alarms, Safety, Intended use, Intended user, Documentation, Technical interfaces, Ethical compliance);
- "ai4h concepts with clusters": AI4H concept names (e.g., two-class and multi-class classification metrics, regression metrics, cross-validation, robustness validation, out-of-sample testing, data diversity, datasheets for data sets, model cards for ML models, uncertainty quantification, outlier detection, robust training) mapped to clusters and super-clusters with their IDs;
- "clusters with super-clusters": the roll-up of clusters into super-clusters, reproduced below.

Cluster                | Super-cluster               | Order
Analytical performance | Performance                 | 1
Benefit-risk           | Performance                 | 1
Clinical performance   | Performance                 | 1
Measurements           | Performance                 | 1
Control                | Risk and control            | 2
Data quality           | Risk and control            | 2
Risk and Alarms        | Risk and control            | 2
Safety                 | Risk and control            | 2
Documentation          | Usability and documentation | 3
Explainability         | Usability and documentation | 3
Intended use           | Usability and documentation | 3
Intended user          | Usability and documentation | 3
Interpretability       | Usability and documentation | 3
Change management      | Life cycle                  | 4
Life cycle             | Life cycle                  | 4
External factors       | Dependencies                | 5
Technical interfaces   | Dependencies                | 5
Ethical compliance     | Ethical compliance          | 6
Software               | (no super-cluster assigned) | -
Language:English
Score: 465156.6 - https://www.itu.int/en/ITU-T/f...ents/all/FGAI4H-G-038-A01.xlsx
Data Source: un
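A minimal sketch of how this roll-up could be represented in code, assuming only the "clusters with super-clusters" mapping reproduced above; the helper function and the two example keyword rows are illustrative, not part of the FGAI4H deliverable:

```python
# Roll-up of FGAI4H clusters into super-clusters, hand-transcribed from the
# "clusters with super-clusters" worksheet above (illustrative only).
SUPER_CLUSTER = {
    "Analytical performance": "Performance",
    "Benefit-risk": "Performance",
    "Clinical performance": "Performance",
    "Measurements": "Performance",
    "Control": "Risk and control",
    "Data quality": "Risk and control",
    "Risk and Alarms": "Risk and control",
    "Safety": "Risk and control",
    "Documentation": "Usability and documentation",
    "Explainability": "Usability and documentation",
    "Intended use": "Usability and documentation",
    "Intended user": "Usability and documentation",
    "Interpretability": "Usability and documentation",
    "Change management": "Life cycle",
    "Life cycle": "Life cycle",
    "External factors": "Dependencies",
    "Technical interfaces": "Dependencies",
    "Ethical compliance": "Ethical compliance",
}

def roll_up(keyword_clusters: dict[str, list[str]]) -> dict[str, set[str]]:
    """Map each IMDRF keyword to the super-clusters its clusters belong to."""
    return {
        keyword: {SUPER_CLUSTER[c] for c in clusters if c in SUPER_CLUSTER}
        for keyword, clusters in keyword_clusters.items()
    }

# Two rows taken from the "key concepts with clusters" worksheet:
keywords = {
    "Accuracy (5.8.1)": ["Analytical performance", "Safety"],
    "Cybersecurity (5.8.5)": ["Technical interfaces", "Safety", "Documentation"],
}
print(roll_up(keywords))
# Accuracy rolls up to {'Performance', 'Risk and control'}; Cybersecurity to
# {'Dependencies', 'Risk and control', 'Usability and documentation'}.
```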
Review of the proposed revisions to the GASP. APRAST/1-WP/4, Agenda Item 9. International Civil Aviation Organization, FIRST MEETING OF THE ASIA PACIFIC REGIONAL AVIATION SAFETY TEAM (APRAST/1) (Bangkok, Thailand, 20-24 February 2012). Agenda Item 9: Regional Performance Framework for Safety. REGIONAL PERFORMANCE FRAMEWORK FOR SAFETY (Presented by the Secretariat). SUMMARY: This paper describes the principles of a performance-based approach to reduce risk and achieve continuous improvement in safety performance through the establishment and monitoring of specific performance criteria based on a data-driven process. (...) 2.4 It is essential to use harmonized terminology in applying a performance-based approach to safety. For performance measurement, three basic terms are explained: a) Performance Indicator: current and past performance, expected future performance, and actual progress in achieving performance objectives are quantitatively expressed by means of performance indicators. (...) In other words, metrics are quantitative measures of system performance - how well the system is functioning; and c) Performance Target: performance targets are closely associated with performance indicators: they represent the values of performance indicators that need to be reached or exceeded for a performance objective to be considered fully achieved.
Language:English
Score: 465052 - https://www.icao.int/APAC/Meet...RAST%2016%20January%202012.pdf
Data Source: un
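To make the indicator/metric/target distinction in the result above concrete, here is a minimal sketch; the metric name, target value and measurement below are invented for illustration and do not come from the working paper:

```python
from dataclasses import dataclass

@dataclass
class PerformanceIndicator:
    """Quantitative expression of current/past performance (hypothetical example)."""
    name: str
    target: float          # performance target: the value to reach or exceed
    higher_is_better: bool = True

    def achieved(self, measured_value: float) -> bool:
        """Per the APRAST/1-WP/4 definitions, a performance objective is fully
        achieved when the indicator reaches or exceeds its target."""
        if self.higher_is_better:
            return measured_value >= self.target
        return measured_value <= self.target

# Illustrative safety metric (assumed, not taken from the paper):
# runway incursions per 100,000 movements. Lower is better, so the
# target acts as an upper bound.
incursion_rate = PerformanceIndicator(
    name="runway incursions per 100k movements",
    target=0.5,
    higher_is_better=False,
)
print(incursion_rate.achieved(0.42))  # True: 0.42 <= 0.5, objective met
```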
[Figure: the steps of the performance-based approach (STEP 1 onwards), posed as questions: What are the ATM community expectations? How to measure performance? What are the performance targets?] What is the current and planned ATM performance? What are the current and anticipated performance gaps and their reasons?
Language:English
Score: 464959.5 - https://www.icao.int/WACAF/Doc...APIRG/APIRG18/Docs/wp11_en.pdf
Data Source: un
ATM performance parameters: 4. What additional parameters are needed to characterize the performance of new ATM transfer capabilities? (...) What parameters are required to describe the throughput performance of ATM connections? Performance objectives, QoS class definitions and performance allocation: 7. (...) 15. What is the performance of AAL type 2 switched connections? Availability performance: 16.
Language:English
Score: 464910.25 - https://www.itu.int/ITU-T/2001-2004/com13/sg13-q7.html
Data Source: un
MINISTRY OF AGRICULTURE, FORESTRY AND WATER ECONOMY, LAW ON PERFORMING AGRICULTURAL ACTIVITY (Draft Version), Skopje, April 2001. I. (...) The Draft Law on performing agricultural activity directly regulates the performance of agricultural activity by certain entities, physical and legal persons, the conditions for performing this activity, and the manner in which such entities are organized. (...) A person performing agricultural activity may register for only one of the types of persons performing agricultural activity referred to in Article 7 of this Law. Article 16: A person performing agricultural activity may perform only the agricultural activity, or activities, listed in the decision for entry into the relevant register.
Language:English
Score: 464907.03 - https://www.wto.org/english/th...c_e/mkd_e/WTACCMKD20_LEG_6.pdf
Data Source: un
This includes examples of quality objectives for performance and QoS parameters for various telecommunication services. (...) E.802, Framework and methodologies for the determination and application of QoS parameters (within approval procedure, TD16Rev1(PLEN/2), TIES account required). http://www.itu.int/pub/T-HDB-QOS.02-2004/en http://www.itu.int/rec/T-REC-E.800-199408-I/en http://www.itu.int/md/T05-SG02-060503-TD-PLEN-0016/en ITU-T Workshop on “End-to-End QoE/QoS”, Geneva, 14-16 June 2006: Overview on ITU-T Handbook on QoS and Network Performance. Backup slides cover the QoS criteria matrix, the four viewpoints of QoS (G.1000), and the Universal, Performance and Four Market models (E.802); the remaining slides cover the handbook's contents, chapters 1-5, annex, conclusion, next steps, ITU-T Recs. E.802 and E.800, contact details and references.
Language:English
Score: 464575.63 - https://www.itu.int/ITU-T/work...6/presentations/s1p2-sypli.pdf
Data Source: un
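As a concrete illustration of quality objectives for QoS parameters of the kind the handbook discusses, here is a minimal sketch; the parameter names and thresholds are invented for illustration and are not values taken from E.800, E.802 or the handbook:

```python
# Hypothetical QoS objectives for a voice service (illustrative values only,
# not drawn from E.800/E.802). "max" bounds the parameter from above,
# "min" from below.
objectives = {
    "one_way_delay_ms": {"max": 150.0},
    "packet_loss_pct":  {"max": 1.0},
    "availability_pct": {"min": 99.9},
}

measurements = {
    "one_way_delay_ms": 97.0,
    "packet_loss_pct": 0.4,
    "availability_pct": 99.95,
}

def meets_objectives(measured: dict, objectives: dict) -> dict:
    """Return per-parameter pass/fail of measured values against objectives."""
    results = {}
    for param, bounds in objectives.items():
        value = measured[param]
        ok = value <= bounds["max"] if "max" in bounds else value >= bounds["min"]
        results[param] = ok
    return results

print(meets_objectives(measurements, objectives))
# {'one_way_delay_ms': True, 'packet_loss_pct': True, 'availability_pct': True}
```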