17e8174f94d72a34a3d8a81dbfebc7a5
22.882
6 Consolidated potential requirements
17e8174f94d72a34a3d8a81dbfebc7a5
22.882
6.1 Energy consumption as service criteria
This subclause contains the requirements related to energy consumption as service criteria and to supporting an energy credit limit for a specific service.
Table 6.1-1: Consolidated requirements on energy consumption as service criteria
CPR 6.1-1: Subject to operator's policy, the 5G system shall support subscription policies that define a maximum energy credit limit for services without QoS criteria. NOTE 1: The definition of subscription is in TS 21.905. (Original PR: PR 5.5.6-1. Comment: Definition of subscription is in TS 21.905; maximum energy credit limit: a policy establishing an upper bound on the quantity of energy used by the 5G system to provide services provided to a specific subscriber (clause 3.1).)
CPR 6.1-2: Subject to operator's policy, the 5G system shall support a means to associate energy consumption with charging information based on subscription policies for services without QoS criteria. (Original PR: PR 5.5.6-2. Comment: Charging aspect.)
CPR 6.1-3: Subject to operator's policy, the 5G system shall support a mechanism to perform energy credit limit control for services without QoS criteria. NOTE 2: The result of the credit control is not specified by this requirement. NOTE 3: Credit control [18] compares against a credit control limit. It is assumed that charging events are assigned a corresponding energy consumption and this is compared against an energy credit limit policy. It is assumed there can be a new policy to limit the energy consumption allowed. (Original PR: PR 5.5.6-4.)
CPR 6.1-4: Subject to operator's policy, the 5G system shall support a means to define and enforce subscription policies that define a maximum energy consumption for services without QoS criteria. NOTE 4: The granularity of the subscription policies can apply either to the subscriber (all services) or to particular services. (Original PRs: PR 5.1.6-1, PR 5.1.6-2.)
CPR 6.1-6: The 5G system shall provide a mechanism to include the ratio of renewable energy as part of charging information. NOTE 5: Calculation of the ratio of renewable energy as described in the preceding requirement is done by means of averaging or applying a statistical model. The requirements do not imply that some form of 'real time' monitoring is required. (Original PRs: PR.5.12.6-1, PR.5.12.6-2.)
CPR 6.1-7: Subject to operator policy and agreement with a 3rd party, the 5G system shall provide a mechanism to support the selection of an application server based on energy consumption information associated with a set of application servers. NOTE 6: Energy consumption information can include the ratio of renewable energy and carbon emission information when available. Calculation of the ratio of renewable energy as described in the preceding requirement is done by means of averaging or applying a statistical model. The requirements do not imply that some form of 'real time' monitoring is required. (Original PR: PR.5.14.6-2.)
CPR 6.1-8: Subject to user consent and operator policy, the 5G system shall be able to provide means to modify a communication service based on energy-related information criteria, based on subscription policies. NOTE 7: Energy consumption information can include the ratio of renewable energy and carbon emission information when available. Calculation of the ratio of renewable energy as described in the preceding requirement is done by means of averaging or applying a statistical model. The requirements do not imply that some form of 'real time' monitoring is required. (Original PR: PR.5.15.6-1.)
CPR 6.1-9: Subject to user consent, operator policy and regulatory requirements, the 5G system shall be able to provide means to operate part or the whole of the network according to energy consumption requirements, which may be based on subscription policies or requested by an authorized 3rd party. (Original PRs: PR.5.9.6-1, PR.5.13.6.1, PR.5.13.6.2.)
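As a rough illustration of the credit control idea in CPR 6.1-3 (charging events carrying an energy amount that is compared against a per-subscriber energy credit limit), the following Python sketch shows one possible accounting loop. The class names, the policy structure and the warning threshold are hypothetical assumptions for illustration; 3GPP does not define this data model here.

```python
# Illustrative sketch only: per-subscriber energy credit limit control
# (CPR 6.1-3). Names and structures are hypothetical, not 3GPP-defined.
from dataclasses import dataclass

@dataclass
class EnergyCreditPolicy:
    limit_wh: float          # maximum energy credit limit, in watt-hours
    warn_ratio: float = 0.8  # flag when 80% of the limit is reached

class EnergyCreditControl:
    def __init__(self, policy: EnergyCreditPolicy):
        self.policy = policy
        self.consumed_wh = 0.0

    def on_charging_event(self, energy_wh: float) -> str:
        """Accumulate the energy attributed to a charging event and
        compare it against the subscriber's energy credit limit."""
        self.consumed_wh += energy_wh
        if self.consumed_wh >= self.policy.limit_wh:
            return "LIMIT_REACHED"      # the resulting action is operator-defined
        if self.consumed_wh >= self.policy.warn_ratio * self.policy.limit_wh:
            return "APPROACHING_LIMIT"  # could be exposed per CPR 6.4-2
        return "OK"

# Example: three charging events against a 10 Wh credit limit
ctrl = EnergyCreditControl(EnergyCreditPolicy(limit_wh=10.0))
for e in (3.0, 5.5, 2.0):
    print(ctrl.on_charging_event(e))
```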
17e8174f94d72a34a3d8a81dbfebc7a5
22.882
6.2 Different energy states of network elements and network functions
This subclause contains the requirements related to different energy states of network elements and network functions and their dynamic changes.
Table 6.2-1: Consolidated requirements on different energy states of network elements and network functions
CPR 6.2-1: The 5G system shall support different energy states of network elements and network functions. (Original PR: PR 5.2.6-1.)
CPR 6.2-2: The 5G system shall support dynamic changes of energy states of network elements and network functions. NOTE 1: This requirement also includes the condition that, when providing network elements or functions to an authorised 3rd party, the dynamic changes can be based on a pre-configured policy (the time of changing energy states, which energy state maps to which level of load, etc.). (Original PR: PR 5.2.6-2.)
CPR 6.2-3: The 5G system shall support different charging mechanisms based on the different energy states of network elements and network functions. (Original PR: PR 5.2.6-3. Comment: Charging aspect.)
NOTE 2: These requirements assume it is possible that there are new energy states of network elements and network functions.
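A minimal sketch of the pre-configured policy mentioned in NOTE 1 of CPR 6.2-2, mapping observed load levels (and, as an example, time of day) to an energy state of a network element. The state names, thresholds and the busy-hour rule are invented for illustration and are not 3GPP-defined values.

```python
# Illustrative sketch: pre-configured policy that maps the observed load of a
# network element to an energy state (CPR 6.2-2, NOTE 1). States and
# thresholds are hypothetical examples.
ENERGY_STATE_POLICY = [
    # (minimum load ratio, energy state)
    (0.60, "FULL_POWER"),
    (0.20, "REDUCED_POWER"),
    (0.00, "SLEEP"),
]

def select_energy_state(load_ratio: float, hour: int) -> str:
    """Return the target energy state for a given load ratio (0..1).
    As an example of a time-of-day rule, never sleep during busy hours."""
    if 7 <= hour <= 22 and load_ratio < 0.20:
        return "REDUCED_POWER"
    for threshold, state in ENERGY_STATE_POLICY:
        if load_ratio >= threshold:
            return state
    return "SLEEP"

print(select_energy_state(0.75, hour=14))  # FULL_POWER
print(select_energy_state(0.05, hour=3))   # SLEEP
```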
17e8174f94d72a34a3d8a81dbfebc7a5
22.882
6.3 Monitoring and measurement related to energy consumption and efficiency
This subclause contains the requirements on monitoring and measurement related to energy consumption and efficiency.
Table 6.3-1: Consolidated requirements on monitoring and measurement related to energy efficiency
CPR 6.3-1: Subject to operator's policy, the 5G network shall support energy consumption monitoring at per-network-slice and per-subscriber granularity. NOTE 1: Energy consumption monitoring as described in the preceding requirement is done by means of averaging or applying a statistical model. The requirement does not imply that some form of 'real time' monitoring is required. The granularity of the subscription policies can apply either to the subscriber (all services) or to particular services. (Original PR: PR 5.1.6-4.)
CPR 6.3-2: Subject to operator's policy and agreement with a 3rd party, the 5G system shall be able to monitor energy consumption for serving this 3rd party, independently of NG-RAN deployment scenarios. NOTE 2: The granularity of energy consumption measurement could vary according to different situations, for example when several services share the same network slice. NOTE 3: The energy consumption information can be related to the network resources of a network slice, NPNs, etc. (Original PRs: PR.5.3.6-1, PR.5.4.6-1, PR.5.6.6-1.)
CPR 6.3-4: Subject to operator policy and regulatory requirements, the 5G system shall be able to monitor the energy consumption for serving the 3rd party together with the network performance statistics for the services provided by that network, using the same update rate, e.g. hourly or daily. NOTE 4: The network performance statistics could be the data rate, packet delay, packet loss, etc. (Original PR: PR.5.7.6-1.)
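The monitoring in CPR 6.3-1 is explicitly allowed to be an averaged or statistical estimate rather than real-time metering. One possible interpretation, sketched below, apportions the energy measured for a network element across slices in proportion to the traffic each slice carried over a reporting window; the apportionment rule is an assumption for illustration, not a normative method.

```python
# Illustrative sketch: averaged per-slice energy consumption estimate
# (CPR 6.3-1). Apportioning energy by traffic share is an assumed model.
def per_slice_energy(total_energy_wh: float, slice_traffic_gb: dict) -> dict:
    """Split the energy measured for a network element over a reporting
    window across slices, proportionally to the traffic each slice carried."""
    total_gb = sum(slice_traffic_gb.values())
    if total_gb == 0:
        return {s: 0.0 for s in slice_traffic_gb}
    return {s: total_energy_wh * gb / total_gb
            for s, gb in slice_traffic_gb.items()}

# Example: 5 kWh measured over a day, shared by two slices
print(per_slice_energy(5000.0, {"eMBB-slice": 800.0, "IoT-slice": 200.0}))
# {'eMBB-slice': 4000.0, 'IoT-slice': 1000.0}
```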
17e8174f94d72a34a3d8a81dbfebc7a5
22.882
6.4 Information exposure related to energy consumption and efficiency
This subclause contains the requirements related to information exposure related to energy consumption and efficiency.
Table 6.4-1: Consolidated requirements on information exposure related to energy consumption
CPR 6.4-1: Subject to operator's policy and agreement with a 3rd party, the 5G system shall be able to expose information on energy consumption for serving this 3rd party. NOTE 1: Energy consumption information can include the ratio of renewable energy and carbon emission information when available. The reporting period could be set, e.g., on a monthly or yearly basis and can vary based on location. NOTE 2: The energy consumption information can be related to the network resources of a network slice, NPNs, etc. (Original PRs: PR.5.3.6-1, PR.5.4.6-1, PR.5.9.6-2, PR 5.10.6-1.)
CPR 6.4-2: Subject to operator's policy, the 5G system shall support a means to expose energy consumption for services to authorized third parties, including energy consumption information related to the condition of the energy credit limit (e.g. when the energy consumption is reaching the energy credit limit). (Original PR: PR 5.5.6-3.)
CPR 6.4-3: Subject to operator policy, the 5G system shall provide means for the trusted 3rd party to configure which network performance statistics (e.g. the data rate, packet delay and packet loss) for the communication service provided to the 3rd party need to be exposed along with the information on energy consumption for serving this 3rd party. (Original PR: PR.5.7.6-2.)
CPR 6.4-4: Based on operator policy and agreement with a 3rd party, the 5G system shall be able to expose energy consumption information and a prediction of the energy consumption of the 5G network per application service to the 3rd party. (Original PRs: PR.5.8.6-1, PR.5.8.6-2.)
CPR 6.4-5: Subject to operator's policy and agreement with a 3rd party, the 5G system shall support a mechanism for the 3rd party to provide current or predicted energy consumption information over a specific period of time. NOTE 3: Energy consumption information can include the ratio of renewable energy used for providing application services on a periodic basis. (Original PR: PR.5.14.6-1.)
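As a sketch of the kind of exposure payload that CPR 6.4-1 and CPR 6.4-2 describe, the following shows one possible report an operator could return to an authorized 3rd party. The field names and JSON shape are assumptions for illustration, not a 3GPP-defined API.

```python
# Illustrative sketch: energy consumption exposure report for a 3rd party
# (CPR 6.4-1 / CPR 6.4-2). Field names and structure are hypothetical.
import json

def build_energy_report(party_id, period, consumed_wh, limit_wh,
                        renewable_ratio=None, co2_g=None):
    report = {
        "thirdParty": party_id,
        "reportingPeriod": period,                 # e.g. "2024-05" (monthly)
        "energyConsumptionWh": consumed_wh,
        "creditLimitWh": limit_wh,
        "approachingCreditLimit": consumed_wh >= 0.8 * limit_wh,
    }
    # Optional fields, included only "when available" as the notes allow
    if renewable_ratio is not None:
        report["renewableEnergyRatio"] = renewable_ratio
    if co2_g is not None:
        report["carbonEmissionGrams"] = co2_g
    return json.dumps(report, indent=2)

print(build_energy_report("enterprise-42", "2024-05", 9100.0, 10000.0,
                          renewable_ratio=0.37))
```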
17e8174f94d72a34a3d8a81dbfebc7a5
22.882
6.6 Temporary communication service pooling over a geographical area for energy saving
This subclause contains the requirements related to temporary communication service pooling over a geographical area for energy saving.
Table 6.6-1: Consolidated requirements on temporary communication service pooling over a geographical area for energy saving
CPR 6.6-1: Subject to regulatory requirements and operators' policies, the 5G system shall enable an operator to temporarily serve UEs of other operators within a geographical area for the purpose of saving energy of the other operators. NOTE 1: The other operators are assumed to stop providing communication service over their own network infrastructure within the same geographical area to save energy during that time. NOTE 2: Policies may include predefined times/locations, energy consumption/efficiency thresholds, etc. NOTE 3: It is assumed that the 5G system can collect charging information associated with serving UEs of other operators. (Original PRs: PR.5.11.6-1, PR.5.11.6-2, PR.5.11.6-4.)
17e8174f94d72a34a3d8a81dbfebc7a5
22.882
7 Conclusion and recommendations
This document analyses a number of use cases to support energy efficiency as service criteria. The resulting potential consolidated requirements have been captured in clause 6. It is recommended to proceed with normative work based on the identified consolidated requirements.
Annex A: Existing energy efficiency standardisation
A.1 Overview of existing energy efficiency standardisation
ETSI, GSMA and 3GPP have produced many reports, studies and specifications related to energy efficiency, and there are ongoing 3GPP Release 18 studies on energy efficiency in both SA5 and RAN. In ETSI, existing specifications cover several aspects of energy efficiency, including energy efficiency metrics and measurement methods for mobile core equipment, metrics and methods to measure the energy performance of mobile radio access networks, and measurement and monitoring of power, energy and environmental parameters for ICT equipment in telecommunications. [2] [3] GSMA has done substantial work in assessing energy consumption in different parts of a communication system. In "Going green: benchmarking the energy efficiency of mobile", GSMA states that 73% of the energy of the participating operators is consumed in the radio access network (RAN); the network core (13%), owned data centres (9%) and other operations (5%) account for the rest. [4] These statistics show that energy efficiency is an end-to-end issue. In 3GPP, energy efficiency has been studied in SA, SA5 and RAN. SA has studied system requirements and principles and provided an Energy Efficiency Control Framework. [5] SA5 has specified concepts, use cases, requirements and solutions for energy efficiency assessment and optimization for energy saving, as well as Energy Efficiency (EE) KPIs. [6] [7] The RAN EE study has concentrated on the definition of network energy consumption models, evaluation methodology and KPIs, and has also studied and identified techniques on the gNB and UE sides to improve network energy savings in terms of both BS transmission and reception. [8]
A.2 Energy efficiency KPIs
3GPP Energy Efficiency KPI definitions are under SA5 (Telecom Management) responsibility. They are based on measurements collected on RAN or CN network elements / network functions via OA&M. The KPI calculation is a generalisation of the work in ETSI TC EE. Figure A.2-1 below shows the KPI derivation with notes to the source specifications.
Figure A.2-1: KPI derivation and sources
A.3 Summary of existing energy efficiency standards
Table A.3-2 below lists the standards relevant to the present document, with a synopsis taken from the Scope clause of each standard.
Table A.3-2: List of EE specifications
3GPP SA, TR 21.866: "Study on Energy Efficiency Aspects of 3GPP Standards" [5]. Summary: Identifies and studies the key issues and the potential solutions in defining Energy Efficiency Key Performance Indicators and the Energy Efficiency optimization operations in existing and future 3GPP networks.
3GPP SA5, TS 28.310: "Management and orchestration; Energy efficiency of 5G" [6]. Summary: Specifies concepts, use cases, requirements and solutions for the energy efficiency assessment and optimization for energy saving of 5G networks.
3GPP SA5, TS 28.552: "Management and orchestration; 5G performance measurements" [11]. Summary: Specifies the performance measurements for 5G networks including network slicing. Performance measurements for NG-RAN are defined in this document, and some L2 measurement definitions are inherited from TS 38.314. The performance measurements for 5GC are all defined in this document. Related KPIs associated with those measurements are defined in TS 28.554 [12].
3GPP SA5, TS 28.554: "Management and orchestration; 5G end to end Key Performance Indicators (KPI)" [12]. Summary: Specifies end-to-end Key Performance Indicators (KPIs) for the 5G network and network slicing.
3GPP SA5, TS 28.622: "Telecommunication management; Generic Network Resource Model (NRM) Integration Reference Point (IRP); Information Service (IS)" [13]. Summary: Specifies the generic network resource information that can be communicated for telecommunication network management purposes, including management data about energy efficiency.
3GPP SA5, TR 28.813: "Management and orchestration; Study on new aspects of Energy Efficiency (EE) for 5G" [7]. Summary: Investigates the opportunities for defining new Energy Efficiency (EE) KPIs and new Energy Saving (ES) solutions.
3GPP RAN1, TR 38.864: "Study on network energy savings for NR" [8]. Summary: Investigates network energy consumption modelling, techniques for network energy saving, and evaluation of gains and impact.
ETSI TC EE, ETSI ES 203 228: "Environmental Engineering (EE); Assessment of mobile network energy efficiency" [3]. Summary: Defines the topology and level of analysis to assess the energy efficiency of mobile networks (excluding terminals).
ETSI TC EE, ETSI ES 202 336-1: "Environmental Engineering (EE); Monitoring and Control Interface for Infrastructure Equipment (Power, Cooling and Building Environment Systems used in Telecommunication Networks) Part 1: Generic Interface" [9]. Summary: Defines monitoring and control of the infrastructure environment, i.e. power, cooling and building environment systems for telecommunication centres and access network locations.
ETSI TC EE, ETSI ES 202 336-12: "Environmental Engineering (EE); Monitoring and control interface for infrastructure equipment (power, cooling and building environment systems used in telecommunication networks); Part 12: ICT equipment power, energy and environmental parameters monitoring information model" [10]. Summary: Defines measurement and monitoring of power, energy and environmental parameters for ICT equipment in telecommunications, datacentre or customer premises.
Annex B: Change history
2022-08, SA1#99-e, S1-222412: TR skeleton. New version: 0.0.0
2022-08, SA1#99-e: Inclusion of approved pCRs from SA1 #99e: S1-222413, S1-222414, S1-222415, S1-222416. New version: 0.1.0
2022-11, SA1#100: Inclusion of approved pCRs from SA1 #100: S1-223431, S1-223654, S1-223656, S1-223657, S1-223658. New version: 0.2.0
2023-02, SA1#101: Inclusion of approved pCRs from SA1 #101: S1-230061, S1-230418, S1-230445, S1-230589, S1-230680, S1-230685, S1-230790, S1-230791, S1-230792, S1-230793. New version: 0.3.0
2023-03, SA#99, SP-230226: MCC clean-up, presentation to SA#99. New version: 1.0.0
2023-05, SA1#102: Inclusion of approved pCRs from SA1 #102: S1-231179, S1-231180, S1-231277, S1-231531, S1-231532, S1-231533, S1-231540, S1-231543, S1-231550, S1-231552, S1-231553, S1-231770, S1-231771, S1-231772. New version: 1.1.0
2023-06, SA#100, SP-230519: MCC clean-up for approval by SA#100. New version: 2.0.0
2023-06, SA#100, SP-230519: Raised to v.19.0.0 by MCC following approval by SA#100. New version: 19.0.0
2023-09, SA#101, SP-231029, CR 0007, Cat D: Quality improvements. New version: 19.1.0
2023-09, SA#101, SP-231029, CR 0005 rev 1, Cat F: Addressing EN 5.11 on pooling. New version: 19.1.0
2023-09, SA#101, SP-231029, CR 0004 rev 1, Cat B: Adding conclusion in TR 22.882. New version: 19.1.0
2023-09, SA#101, SP-231029, CR 0006 rev 2, Cat F: Consolidation of 5.11 PRs on pooling. New version: 19.1.0
2023-09, SA#101, SP-231029, CR 0002 rev 3, Cat F: Updating existing CPR in TR 22.882. New version: 19.1.0
2023-09, SA#101, SP-231029, CR 0001 rev 3, Cat F: Addition of terminology for energy consumption as a service criteria. New version: 19.1.0
2023-09, SA#101, SP-231029, CR 0003 rev 4, Cat B: Updating CPR with newly agreed PRs. New version: 19.1.0
2023-09, SA#101, SP-231029, CR 0008 rev 3, Cat B: Updating CPR with newly agreed PRs. New version: 19.1.0
2023-12, SA#102, SP-231414, CR 0009 rev 2, Cat B: Consolidation requirements update with leftover PRs. New version: 19.2.0
2024-03, SA#103, SP-240202, CR 0011 rev 1, Cat D: Editorial corrections in FS_EnergyServ. New version: 19.3.0
5eaf1b94c89939407995fc52470817be
22.890
1 Scope
The present document analyses use cases of a smart railway station, such as station operation monitoring and control, passenger supporting services and evolution use cases of the business and performance applications currently included in TR 22.989, in order to derive potential requirements.
5eaf1b94c89939407995fc52470817be
22.890
2 References
The following documents contain provisions which, through reference in this text, constitute provisions of the present document.
- References are either specific (identified by date of publication, edition number, version number, etc.) or non-specific.
- For a specific reference, subsequent revisions do not apply.
- For a non-specific reference, the latest version applies. In the case of a reference to a 3GPP document (including a GSM document), a non-specific reference implicitly refers to the latest version of that document in the same Release as the present document.
[1] 3GPP TR 21.905: "Vocabulary for 3GPP Specifications".
[2] UIC MG-7900 v2.0.0: "Future Railway Mobile Communication System – Use cases", Feb. 2020.
[3] UIC FU-7100 v5.0.0: "Future Railway Mobile Communication System – User Requirements Specification", Feb. 2020.
[4] TTA TTAK.KO-06.0507/R1: "Requirements for Smart Railway Device - Information Model", Dec. 2020.
[5] TTA TTAK.KO-06.0508/R1: "Requirements for Smart Railway Platform - Information Model", Dec. 2020.
[6] 3GPP TR 22.990: "Study on off-network for rail".
[7] 3GPP TS 22.280: "Mission Critical Services Common Requirements (MCCoRe)".
[8] 3GPP TS 22.282: "Mission Critical (MC) data".
[9] 3GPP TS 22.289: "Mobile communication system for railways".
5eaf1b94c89939407995fc52470817be
22.890
3 Definitions and abbreviations
5eaf1b94c89939407995fc52470817be
22.890
3.1 Definitions
For the purposes of the present document, the terms and definitions given in 3GPP TR 21.905 [1] and the following apply. A term defined in the present document takes precedence over the definition of the same term, if any, in 3GPP TR 21.905 [1].
Mobile Intelligent Assistant: a 5G-enabled robot with autonomous movement and artificial intelligence to support passengers in the Railway Smart Station.
Railway Smart Station: a train station where 5G and other ICT technologies, such as IoT and AI, are used to provide assisting railway services.
Zone: a 2-dimensional region of a pre-determined size.
Zone resolution: the pre-determined size of the given zone.
5eaf1b94c89939407995fc52470817be
22.890
3.2 Abbreviations
For the purposes of the present document, the abbreviations given in 3GPP TR 21.905 [1] and the following apply. An abbreviation defined in the present document takes precedence over the definition of the same abbreviation, if any, in 3GPP TR 21.905 [1].
AI Artificial Intelligence
FRMCS Future Railway Mobile Communication System
ITS Intelligent Transport System
IoT Internet of Things
MCX Mission Critical X, with X = PTT or X = Video or X = Data
5eaf1b94c89939407995fc52470817be
22.890
4 Overview
The railway station is a major touchpoint for customers, including passengers. The railway community is considering Railway Smart Station services for railway station operations and customers. Many railway companies are planning to adopt Railway Smart Station services, and some companies have already launched their own projects to provide such services over Mission Critical Services (MCPTT, MCVideo, MCData services). TR 22.889/989 provides basic requirements and focuses on the railway communication system having three categories: critical, performance and business communications. While the critical and performance communications address the safe movement of trains, business communications provide value-added services for passengers. The Railway Smart Station services provide assistance for station operation and value-added services for passengers, e.g. subway station evacuation guidance via the passenger's UE. The objectives of this technical report are as follows:
• Study use cases related to Railway Smart Station services and deduce requirements from them, for example:
• use cases of station operation monitoring and control;
• use cases of passenger supporting services;
• evolution use cases of business and performance applications currently included in TR 22.889/989.
• Analyse gaps between the requirements identified and the functionality already provided by 3GPP (e.g., 22.261, 22.228, or MCX specifications).
5eaf1b94c89939407995fc52470817be
22.890
5 Performance communication applications related use cases
5eaf1b94c89939407995fc52470817be
22.890
5.1 Emergency use case of smart station – fire in station
5eaf1b94c89939407995fc52470817be
22.890
5.1.1 Description
The fire-in-station use case describes an emergency situation and the management of that situation in the context of a railway smart station. Through this use case, technical keywords are deduced and a number of potential requirements are derived from those keywords.
5eaf1b94c89939407995fc52470817be
22.890
5.1.2 Pre-conditions
Fire detectors are 3GPP UEs. The passengers have their own 3GPP UEs as their smartphones. There is a fire somewhere in a railway smart station. A fire detector senses the situation and reports it to the railway smart station system via the 3GPP network. Some passengers also recognize the situation and register it with the station system via the railway smart station app on their 3GPP UEs, and some of them notify the fire fighting force and/or the police.
5eaf1b94c89939407995fc52470817be
22.890
5.1.3 Service Flows
1. The Railway Smart Station System identifies the fire situation from the fire detector. As the fire grows, the number of fire detectors providing sensing information also increases. The system indicates the location and direction of the fire's spread by analysing the information from the detectors.
2. The Railway Smart Station System starts a fire emergency protocol. It declares the situation to the nearby firehouses and police stations, and provides the station information, e.g. the station map, via a specific interface that is not in scope of 3GPP. It finds available devices in the station and controls the devices to handle the situation. For example, the system controls fire sprinklers to suppress the fire efficiently. An evacuation warning and emergency exit information are announced to people in the station via the audio broadcasting devices. Emergency messages are sent to the people as well. The system controls emergency lamps and direction lights to give guidance information for evacuating the people.
3. The system notifies the neighbouring stations and the incoming trains of the situation. The trains make an emergency stop at a safe place and evacuate their passengers. The system sends emergency protocol information to the railway workers in the station depending on their group roles in the protocol.
4. The system video-streams the fire site using a camera near the fire place. The people in the station can watch the stream and get information on the fire site, e.g. its location and extent.
5. The system obtains UE information and assigns roles to the UEs of the workers, firefighters and police officers, and downloads the information needed to support each role onto the UEs by interfacing with the systems of the fire department and the police. The roles change dynamically depending on the duty status of the workers, firefighters and officers.
6. The system and the people's UEs cooperate to count the number of people in the station. The people are included in the rescue group autonomously.
5eaf1b94c89939407995fc52470817be
22.890
5.1.4 Post-conditions
The devices in the station are monitored and controlled by the Railway Smart Station System. The railway workers, firefighters and police officers carry out their duties using their UEs. The people in the station and on the incoming train escape the station and move to a safe place using their UEs, which are interfaced with the system.
5eaf1b94c89939407995fc52470817be
22.890
5.1.5 Existing features partly or fully covering the use case functionality
Role and group management are fully covered by the 3GPP system and the MCX framework.
5eaf1b94c89939407995fc52470817be
22.890
5.1.6 Potential New Requirements needed to support the use case
[PR-5.1.6-1] The 5G system shall support access to various networks which are used to monitor and control the devices in the station.
[PR-5.1.6-2] The 5G system shall support connecting a massive number of devices in a specific area of the station which is defined for monitoring and/or control.
[PR-5.1.6-3] The 5G system should support interfacing with an external system to control the UEs that belong to that external system.
[PR-5.1.6-4] The 5G system should support counting the number of UEs in a specific area of the station, conditioned on UE category and UE status.
[Editor's note: The requirements are FFS.]
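A minimal sketch of the kind of filtered counting that PR-5.1.6-4 asks for, assuming the station system already holds a list of UE records with a position, category and status. The record fields, coordinate system and area test are hypothetical and only illustrate the intent of the requirement.

```python
# Illustrative sketch for PR-5.1.6-4: count UEs in a given station area,
# filtered by UE category and status. Record fields are hypothetical.
def count_ues(ue_records, area, category=None, status=None):
    """area is an axis-aligned rectangle (x_min, y_min, x_max, y_max)
    in station-local coordinates (metres)."""
    x_min, y_min, x_max, y_max = area
    count = 0
    for ue in ue_records:
        x, y = ue["position"]
        if not (x_min <= x <= x_max and y_min <= y <= y_max):
            continue
        if category is not None and ue["category"] != category:
            continue
        if status is not None and ue["status"] != status:
            continue
        count += 1
    return count

ues = [
    {"position": (12.0, 3.5), "category": "passenger", "status": "active"},
    {"position": (14.2, 4.1), "category": "fire_detector", "status": "alarm"},
    {"position": (40.0, 9.0), "category": "passenger", "status": "active"},
]
# Passengers currently inside platform area (0,0)-(20,10)
print(count_ues(ues, (0, 0, 20, 10), category="passenger"))  # 1
```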
5eaf1b94c89939407995fc52470817be
22.890
5.2 Multiple trains' stops at the same platform
5eaf1b94c89939407995fc52470817be
22.890
5.2.1 Description
Reducing train intervals is necessary to increase track capacity. If the train intervals are shorter than a certain threshold, multiple trains should stop at the same platform. This is because the interval between trains can be reduced, but for safety reasons, there are limits to reducing the time for passengers to get on and off and to increasing the speed of trains entering the station. In general, it takes more than 1 minute for a train to enter and exit the platform, including the time for passengers to get on and off, so if the train interval becomes shorter than 1 minute, there can be two trains on the platform. The scenario when two trains stop at the same platform is as follows:
- The passenger information system (PIS) of a smart station displays that Train 1 will stop at the front of the platform and Train 2 will stop at the back of the platform. Trains 1 and 2 may have different routes.
- Step 1: A previous train is departing the platform, and Train 1 enters the platform while a part of the previous train is still at the platform. According to the previous train's location, Train 1 reduces its speed and moves to the front of the platform.
- Step 2: After the previous train has left the platform, Train 1 stops at the front of the platform.
- Step 3: While Train 1 is stopped for passengers to get on and off, Train 2 enters the platform.
- Step 4: Train 2 stops behind Train 1 and opens its doors. While the passengers of Train 2 are getting on and off, Train 1 starts to depart the platform.
Figure 6.2.1-1: Two trains' stops at the same platform
5eaf1b94c89939407995fc52470817be
22.890
5.2.2 Pre-conditions
- Each train has at least one onboard UE (i.e., an FRMCS UE), supporting both on-network and off-network communications.
- There is an edge server per station; the edge server can determine at which platform trains stop and can transmit/receive data to/from the onboard UEs of Trains 1 and 2 through the 3GPP network.
- Each edge server can transmit/receive information to/from the PIS.
- Onboard UEs know the identities of the edge servers on the train route.
5eaf1b94c89939407995fc52470817be
22.890
5.2.3 Service Flows
1. Train 1 is stopped at Platform A of the smart station, and Train 2 is approaching. Trains 1 and 2 are connected and authorized to transmit/receive information to/from the server at the smart station.
2. The server determines that Train 2 will stop at Platform A. The server informs Train 2 of the platform where Train 2 will stop, the stop location within the platform, and the existence of Train 1 at the platform. The server also informs Train 1 that Train 2 will stop behind Train 1.
3. Train 2 establishes a connection with Train 1 through the on-network and then notifies the server of the connection establishment with Train 1. Trains 1 and 2 can share information such as acceleration/deceleration, braking, location, etc., through the connection in order to stop at the same platform.
4. The server allows Train 2 to enter the platform.
5. For redundancy, Train 2 can add a connection with Train 1 through the off-network before entering the platform.
6. Train 2 stops behind Train 1 at Platform A and opens its doors for passengers to get on and off. Train 1 starts to depart.
5eaf1b94c89939407995fc52470817be
22.890
5.2.4 Post-conditions
- Train 1 establishes a connection with the server at the next station while Train 1 has a connection with the server at the current station. - When the distance between Trains 1 and 2 becomes longer, Trains 1 and 2 stop sharing the information and disconnect from each other.
5eaf1b94c89939407995fc52470817be
22.890
5.2.5 Existing features partly or fully covering the use case functionality
TR 22.990 [6] covers utilizing off-network and on-network communications at the same time and the traffic characteristics of off-network communications.
5eaf1b94c89939407995fc52470817be
22.890
5.2.6 Potential New Requirements needed to support the use case
5eaf1b94c89939407995fc52470817be
22.890
5.2.6.1 Requirements related to the Service layer
[PR 5.2.6-1] A single mobile FRMCS UE shall be able to connect simultaneously to multiple edge servers which are located along rail tracks.
Note: The above requirement is intended to be included in Section 5.5 of TS 22.282 [8].
[Editor's note: The potential requirements for FRMCS UEs to identify the edge servers are FFS.]
[Editor's note: The potential requirements for edge servers to authorize the FRMCS UEs are FFS.]
5eaf1b94c89939407995fc52470817be
22.890
5.2.6.2 Requirements related to the Transport layer
[PR 5.2.6-2] The FRMCS System shall support the following traffic characteristics of data transfer:
Note: This table is intended to be included in Section 6.2 of TS 22.289 [9].
Table 5.2.6.2-1: Traffic characteristics for multiple trains' stops at the same platform
Scenario (Note 5): Multiple trains' stops at the same platform (Korea, urban railway)
End-to-end latency: ≤ 10 ms
Reliability (Note 1): 99.9999%
UE speed: ≤ 100 km/h
UE relative speed: ≤ 50 km/h
User experienced data rate: ≤ 1 Mb/s
Payload size (Note 2): small to large
Area traffic density: ≤ 1 Mb/s/km
Overall UE density: ≤ 5 (100 m)
Service area dimension (Note 3): ≤ 15 km along rail tracks, including bad weather conditions (Note 4)
NOTE 1: Reliability as defined in TS 22.289 sub-clause 3.1.
NOTE 2: Small: payload ≤ 256 octets; Medium: payload ≤ 512 octets; Large: payload 513-1500 octets.
NOTE 3: Estimates of maximum dimensions.
NOTE 4: Non-Line-of-Sight (NLOS) between UEs shall be supported.
NOTE 5: Off-network traffic characteristics are not addressed in this table since they can be covered by TR 22.990.
5eaf1b94c89939407995fc52470817be
22.890
6 Business communication applications related use cases
5eaf1b94c89939407995fc52470817be
22.890
6.1 Transportation convenience service for passengers with reduced mobility
5eaf1b94c89939407995fc52470817be
22.890
6.1.1 Description
In the Railway Smart Station, a transportation convenience service for passengers with reduced mobility is feasible, such as a mobility service that brings passengers to their desired destination. Figure 6.1.1-1: Example of a transport convenience service for a passenger with reduced mobility
5eaf1b94c89939407995fc52470817be
22.890
6.1.2 Pre-conditions
1. There are Mobile Intelligent Assistants in the Railway Smart Station, and the Mobile Intelligent Assistants support the 3GPP system.
2. The Mobile Intelligent Assistants are operated under the central control system via 3GPP access.
3. There is at least one passenger with reduced mobility in the Railway Smart Station who has difficulty moving towards the desired destination.
4. The passenger has equipment supporting 3GPP access.
5eaf1b94c89939407995fc52470817be
22.890
6.1.3 Service Flows
1. A passenger with reduced mobility has made a reservation in advance, so the Railway Smart Station already knows that the passenger needs help to get to the desired destination.
2. Once the passenger enters the Railway Smart Station, one Mobile Intelligent Assistant stands by to provide mobility support to the desired destination.
3. The Mobile Intelligent Assistant takes the passenger to the desired place.
5eaf1b94c89939407995fc52470817be
22.890
6.1.4 Post-conditions
1. The Railway Smart Station traces and manages the route of movement of the passenger with reduced mobility.
5eaf1b94c89939407995fc52470817be
22.890
6.1.5 Existing features partly or fully covering the use case functionality
[R-5.11-001] The MCX Service shall support obtaining and conveying Location information describing the position of the MCX UE.
[R-5.11-002] The MCX Service should support obtaining and conveying high accuracy Location information describing the position of the MCX UE.
[R-5.11-002a] The MCX Service shall be able to provide a mechanism for obtaining high accuracy Location information by integrating position information from multiple external sources (e.g. magnetometers, orientation sensors, GNSS).
[R-5.11-003] The MCX Service shall provide for the flexibility to convey future formats of Location information.
[R-6.12-002] The MCX Service shall support conveyance of Location information provided by 3GPP location services.
Note: Please refer to TS 22.280 V17.6.0 [7].
5eaf1b94c89939407995fc52470817be
22.890
6.1.6 Potential New Requirements needed to support the use case
[R-6.1.6-1] The MCX service shall be able to support obtaining and conveying location information as scalable zone information describing the position of the MCX UE.
Editor's note: This requirement is FFS.
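Using the Zone and Zone resolution terms from clause 3.1, one possible reading of "scalable zone information" is that a UE position is reported as the index of the zone containing it, at a chosen zone resolution. The sketch below shows this quantization under that assumption; the coordinate system, function names and encoding are invented for illustration and are not defined by this TR.

```python
# Illustrative sketch for [R-6.1.6-1]: report a UE position as scalable zone
# information, i.e. the index of the zone containing it at a given zone
# resolution. Coordinates and the encoding are hypothetical.
import math

def position_to_zone(x_m: float, y_m: float, zone_resolution_m: float):
    """Quantize a station-local position (metres) into a zone index.
    A larger zone resolution gives coarser, more scalable location info."""
    return (math.floor(x_m / zone_resolution_m),
            math.floor(y_m / zone_resolution_m))

def zone_center(zone, zone_resolution_m: float):
    """Representative location of a zone (its centre), as used by the
    mobility services described in clause 6.3."""
    i, j = zone
    return ((i + 0.5) * zone_resolution_m, (j + 0.5) * zone_resolution_m)

pos = (37.4, 12.9)
for res in (1.0, 5.0, 20.0):          # finer to coarser accuracy
    z = position_to_zone(*pos, res)
    print(res, z, zone_center(z, res))
```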
5eaf1b94c89939407995fc52470817be
22.890
6.2 Smart kiosk of Railway Smart Station
5eaf1b94c89939407995fc52470817be
22.890
6.2.1 Description
A smart kiosk in the railway smart station provides various information to passengers, such as a location information service with a 3D or metaverse-enabled station map, a simple ticketing service, and other information services by interfacing with smart station devices and systems, e.g. CCTV and sensors. The kiosk cooperates with a Mobile Intelligent Assistant, such as a robot, to support passengers if the kiosk operating system decides that this is necessary.
5eaf1b94c89939407995fc52470817be
22.890
6.2.2 Pre-conditions
A passenger has a ticket for a train. The passenger has a UE which is Railway Smart Station service enabled. A kiosk has a UE which is Railway Smart Station service enabled. A Mobile Intelligent Assistant has a UE which is Railway Smart Station service enabled. The passenger has given permission for his/her identification to be handled by the FRMCS, such as the smart station system.
5eaf1b94c89939407995fc52470817be
22.890
6.2.3 Service Flows
1. A passenger is standing in front of a smart kiosk to find his train's platform.
2. The passenger's UE makes a proximity connection with the kiosk, or vice versa, using 5G off-network communication.
3. The passenger sends identification information of the passenger (e.g. customer ID) or of the train (e.g. train number).
4. The kiosk shows the path to the platform of the train on a 3D or metaverse-enabled map of the smart station and transmits the information to the passenger's UE.
5. The passenger gets a physical ticket or receipt from the kiosk and, if the passenger is transport vulnerable, asks the kiosk for guidance by the Mobile Intelligent Assistant, i.e. the guidance robot of the smart station.
6. The kiosk asks the FRMCS for the locations of the Mobile Intelligent Assistants, arranges the one closest to the passenger and brings the Mobile Intelligent Assistant to the passenger, for example, in front of the kiosk.
7. The Mobile Intelligent Assistant comes to the passenger and confirms the identification information of the passenger by connecting to the passenger's UE via 5G off-network communication.
8. The passenger gets on the Mobile Intelligent Assistant and moves to the platform.
5eaf1b94c89939407995fc52470817be
22.890
6.2.4 Post-conditions
The passenger gets on the train and enjoys his/her journey. The Mobile Intelligent Assistant is released from the passenger service and is ready to be assigned to another service.
5eaf1b94c89939407995fc52470817be
22.890
6.2.5 Existing features partly or fully covering the use case functionality
MCX location, identity and group management cover the related functionality of this use case.
5eaf1b94c89939407995fc52470817be
22.890
6.2.6 Potential New Requirements needed to support the use case
[PR-6.2.6-1] The 5G System shall support proximity connections between the UE, the kiosk and the Mobile Intelligent Assistant.
[PR-6.2.6-2] The 5G System shall be able to provide functions to handle the smart station information needed by a kiosk to display a 3D or metaverse-enabled smart station map and to generate path information.
5eaf1b94c89939407995fc52470817be
22.890
6.3 Multiple concurrent mobility services
5eaf1b94c89939407995fc52470817be
22.890
6.3.1 Description
In the Railway Smart Station, a transportation convenience service for passengers with reduced mobility is feasible, such as a mobility service that brings passengers to their desired destination. Figure 6.3.1-1: Example of multiple concurrent mobility services
5eaf1b94c89939407995fc52470817be
22.890
6.3.2 Pre-conditions
1. There are Mobile Intelligent Assistants in the smart station, and the Mobile Intelligent Assistants support the 3GPP system.
2. The Mobile Intelligent Assistants are operated under the central control system via 3GPP access.
3. Each Mobile Intelligent Assistant supports a corresponding mobility service.
4. There exist two or more mobility services, where each mobility service requires a different location accuracy and is supported by a different Mobile Intelligent Assistant.
5eaf1b94c89939407995fc52470817be
22.890
6.3.3 Service Flows
1. Two different mobility services are initiated by the central control system.
2. Mobile Intelligent Assistant #1 and Mobile Intelligent Assistant #2 move along their predetermined paths. Here, each path is characterized by the representative locations of the corresponding zones.
3. Mobile Intelligent Assistant #1 and Mobile Intelligent Assistant #2 move along the representative locations of the blue and red zones, respectively.
4. The two different mobility services are completed by Mobile Intelligent Assistant #1 and Mobile Intelligent Assistant #2, where the completion times can be different.
5eaf1b94c89939407995fc52470817be
22.890
6.3.4 Post-conditions
1. Two different mobility services are supported in the Railway Smart Station.
5eaf1b94c89939407995fc52470817be
22.890
6.3.5 Existing features partly or fully covering the use case functionality
[R-5.11-001] The MCX Service shall support obtaining and conveying Location information describing the position of the MCX UE.
[R-5.11-002] The MCX Service should support obtaining and conveying high accuracy Location information describing the position of the MCX UE.
[R-5.11-002a] The MCX Service shall be able to provide a mechanism for obtaining high accuracy Location information by integrating position information from multiple external sources (e.g. magnetometers, orientation sensors, GNSS).
[R-5.11-003] The MCX Service shall provide for the flexibility to convey future formats of Location information.
[R-6.12-002] The MCX Service shall support conveyance of Location information provided by 3GPP location services.
Note: Please refer to TS 22.280 [7].
5eaf1b94c89939407995fc52470817be
22.890
6.3.6 Potential New Requirements needed to support the use case
[PR-6.3.6-1] The MCX service shall support obtaining and conveying location information describing the positions of each MCX UE with different location accuracy simultaneously. Note: The above requirement is intended to be included in Sections 5.11 and 6.12 of TS 22.280 [7].
5eaf1b94c89939407995fc52470817be
22.890
6.4 Operation of platform screen doors
5eaf1b94c89939407995fc52470817be
22.890
6.4.1 Description
For the safety of a platform, screen doors are needed at the edge of the platform to prevent dangerous situations when a train approaches the platform and passengers get on and off the train. The screen doors are opened before the train doors open and are closed after the train doors close. If there is an emergency situation, the designated CCTVs are controlled to aim at the emergency spot and relay the video of the spot to the Train Driver's monitors and the railway station staff's monitors, including their UEs, to assist their actions to handle the situation. Figure 6.4.2-1: Operation of train and screen doors with CCTVs in a platform
5eaf1b94c89939407995fc52470817be
22.890
6.4.2 Pre-conditions
Some CCTVs are pre-designated to aim at each part of the platform in case of emergency. The CCTVs, the Train Driver and the staff of the station are pre-defined as a group for the emergency. In the train, a Trainborne System controls the train doors. The Screen Door Controller handles the screen doors in the platform of the station. Synchronisation is maintained between the system and the doors so that the train doors and the screen doors open and close at the same time. While the train approaches the platform of the station, the Trainborne System and the Screen Door Controller check that the train stops in the right place and that the doors are well aligned.
5eaf1b94c89939407995fc52470817be
22.890
6.4.3 Service Flows
1. The CCTVs in the train and on the platform start video-recording each door and display the videos on the Train Driver's monitors. If the Train Driver finds an abnormal status while monitoring the videos from the CCTVs, the Train Driver can open or close all the doors in the platform and the train manually.
2. The Trainborne System notifies the Screen Door Controller that the train doors are to be opened.
3. The Screen Door Controller announces the screen door opening to the passengers on the platform via displays attached to each screen door and/or the speakers of the train and the platform.
4. The Screen Door Controller opens the screen doors and notifies the Trainborne System.
5. The Trainborne System makes an announcement of the train door opening to the passengers in the train.
6. The Trainborne System opens the train doors.
7. For a pre-defined time, the train door sensors and screen door sensors detect the passengers' movement. While the passengers are getting on and off the train, passengers can keep the doors open by pushing an emergency button on the doors. In this case, a notice is sent to the Train Driver and the station staff in the pre-defined group to let them know about this situation, and the designated CCTVs are controlled to aim at the emergency spot, using the location information of the emergency button, to assist the Train Driver and the staff in assessing the situation.
8. When the time is up, the Trainborne System and the Screen Door Controller announce to the passengers that the doors are closing. If, while the doors are closing, the sensors in the doors detect passengers or obstacles such as a bag or an umbrella in the door area, the doors stop closing and are re-opened automatically by the Trainborne System and the Screen Door Controller, and the Train Driver is notified of the situation by the Trainborne System. CCTVs are aimed at the location of the doors, video-record the situation and display it on the Train Driver's monitors. The Train Driver can select a CCTV from the list of CCTVs on the platform and control it manually to get closer video.
9. The Trainborne System notifies the Screen Door Controller of the train door closing and closes the train doors.
10. The Screen Door Controller closes the screen doors and notifies the Trainborne System of the completion of the door closing.
11. The Trainborne System notifies the Train Driver that the train is ready to go.
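The flow above is essentially a request/acknowledge exchange between the Trainborne System and the Screen Door Controller. The sketch below models a simplified version of steps 2-11 as message passing between two objects; the class and message names are hypothetical and are not part of any FRMCS interface definition.

```python
# Illustrative sketch of the Trainborne System / Screen Door Controller
# exchange (steps 2-11, simplified). Class and message names are hypothetical.
class ScreenDoorController:
    def __init__(self):
        self.doors_open = False

    def handle(self, msg, trainborne):
        if msg == "TRAIN_DOORS_WILL_OPEN":          # steps 2-4
            print("SDC: announcing and opening screen doors")
            self.doors_open = True
            trainborne.handle("SCREEN_DOORS_OPEN", self)
        elif msg == "TRAIN_DOORS_CLOSING":          # steps 9-10
            print("SDC: closing screen doors")
            self.doors_open = False
            trainborne.handle("SCREEN_DOORS_CLOSED", self)

class TrainborneSystem:
    def __init__(self):
        self.doors_open = False

    def handle(self, msg, sdc):
        if msg == "SCREEN_DOORS_OPEN":              # steps 5-6
            print("Trainborne: announcing and opening train doors")
            self.doors_open = True
        elif msg == "SCREEN_DOORS_CLOSED":          # step 11
            print("Trainborne: ready to go")

    def dwell_finished(self, sdc):                  # steps 8-9
        print("Trainborne: announcing door closing, closing train doors")
        self.doors_open = False
        sdc.handle("TRAIN_DOORS_CLOSING", self)

sdc, tbs = ScreenDoorController(), TrainborneSystem()
sdc.handle("TRAIN_DOORS_WILL_OPEN", tbs)  # arrival sequence
tbs.dwell_finished(sdc)                   # departure sequence
```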
5eaf1b94c89939407995fc52470817be
22.890
6.4.4 Post-conditions
All the doors are closed and the train moves to the next station.
5eaf1b94c89939407995fc52470817be
22.890
6.4.5 Existing features partly or fully covering the use case functionality
Group management is fully covered by the 5G system and the MCX framework.
Note: Please refer to TS 22.281 V17.0.0.
[R-5.1.3.1.2-001] The MCVideo service shall provide a mechanism for an MCVideo user to remotely control a camera on another MCVideo UE subject to relevant authorization.
[R-5.1.9.2.2-001] The MCVideo service shall provide a mechanism for an authorized MCVideo User to push video to another MCVideo User.
[R-5.1.9.2.2-002] The MCVideo service shall provide a mechanism for an MCVideo administrator to authorize an MCVideo user to push a video to another MCVideo user.
[R-5.1.9.2.2-008] The MCVideo service shall provide a mechanism for an MCVideo User to suspend and to resume receiving an incoming video stream from an MCVideo push.
Note: Please refer to TS 22.280 V18.2.0 [7].
[R-5.1.1-002] The MCX Service shall provide a mechanism by which an MCX UE makes an MCX Service group transmission to any MCX Service Group(s) for which the current MCX User is authorized.
[R-5.1.1-006] The MCX Service shall provide a mechanism for a dispatcher or authorized user to configure which content source shall be able to transmit the content to an MCX Service Group (e.g. video cameras near an incident).
[R-5.21.1.2-004] The MCX Service shall provide a mechanism for an MCX User to request an authorized MCX User (e.g., a dispatcher) to send an MCX Communication (e.g., video or data) to the MCX UE (downlink pull).
5eaf1b94c89939407995fc52470817be
22.890
6.4.6 Potential New Requirements needed to support the use case
[PR-6.4.6-1] The FRMCS shall be able to provide a mechanism to trigger an emergency alert based on a combination of UE location (e.g. the location of the specific platform door / train door) and application-generated trigger (e.g. train door did not close properly due to a blockage).
5eaf1b94c89939407995fc52470817be
22.890
6.5 Automatic monitoring of Railway Smart Station
5eaf1b94c89939407995fc52470817be
22.890
6.5.1 Description
Monitoring a railway station is hard work. It must be carried out 24 hours a day, 7 days a week. It is carried out through dozens of CCTVs, and a controller cannot check all the CCTVs at once. To assist CCTV monitoring, an AI system helps the controller. The AI system is part of the Railway Smart Station services, and it receives live streaming video input from the CCTVs in the Railway Smart Station. The AI system inspects the input video streams and finds abnormal situations such as illegal riding, an unattended suspicious object, unauthorized entry, or a person falling from the platform. If it detects an abnormal situation, it issues a warning to the station staff and the control office. Figure 6.5.1-1: Use case of automatic monitoring that covers an emergency situation
5eaf1b94c89939407995fc52470817be
22.890
6.5.2 Pre-conditions
Some CCTVs are pre-designated to aim at each part of the station in case of emergency. An AI system is trained to provide automatic monitoring functions for the railway smart station. Some abnormal cases are pre-defined in the system.
5eaf1b94c89939407995fc52470817be
22.890
6.5.3 Service Flows
1. The CCTVs in the station provide live streaming videos of the situation in the station, such as the platforms.
2. The AI system analyses the video data from the dozens of CCTVs.
3. An abnormal situation occurs: a passenger has fallen from the platform.
4. The AI system raises an alarm to the controller of the station to inform him of the situation. The system also sends a notification alarm to station staff who are close to the place where the situation occurred.
5. The staff arrive at the accident site, rescue the passenger, and resolve the surrounding situation.
6. The controller is aware of the situation and contacts the train to prevent it from entering the platform.
7. The AI system records the video, call history, and actions taken in the process of handling the abnormal situation as data for future learning.
5eaf1b94c89939407995fc52470817be
22.890
6.5.4 Post-conditions
Passengers are rescued, the circumstances are cleared up, and trains are allowed to enter the platform. The data recorded by the AI system is later used in audits of the handling of the case.
5eaf1b94c89939407995fc52470817be
22.890
6.5.5 Existing features partly or fully covering the use case functionality
Group management is fully covered by the 5G system and the MCX framework.
Note: Please refer to TS 22.280 V18.2.0 [7].
[R-5.11-009] The MCX Service shall provide a means for an MCX UE to send a Location information update whenever a trigger condition is satisfied (e.g., initial registration, distance travelled, elapsed time, cell change, tracking area change, PLMN change, MCX Service communication initiation).
[R-6.15.4-004] The MCX Service shall provide a mechanism for a Mission Critical Organization to log at least the following metadata per communication; depending on the service this may include: start time, date, MCX User ID, functional alias(es), MCX Group ID, Location information of the transmitting Participant, end time or duration, end reason, type of communication (e.g., MCX Service Emergency, regroup, private) and success/failure indication.
Note: Please refer to TS 22.281 V17.0.0.
[R-5.1.3.3.2-001] The MCVideo service shall provide a mechanism for an authorised MCVideo User to remotely start and stop local recording of video.
[R-5.1.3.3.2-002] The MCVideo service shall provide a mechanism for an authorised MCVideo User to remotely set triggers for automatic commencement of video transmission to authorised MCVideo Users; such triggers to include motion detection, time of day, face recognition, licence plate recognition, location and speed.
5eaf1b94c89939407995fc52470817be
22.890
6.5.6 Potential New Requirements needed to support the use case
No new potential requirements identified.
5eaf1b94c89939407995fc52470817be
22.890
7 Potential Consolidated Requirements
5eaf1b94c89939407995fc52470817be
22.890
7.1 Introduction
The requirements below refer to "Railway Smart Station Services", which act as an application towards the FRMCS and the outer systems. The potential consolidated requirements mainly focus on the 5G network characteristics and on the interfaces between the FRMCS/MCX functions and the Railway Smart Station Services, as the application of the FRMCS and an outer system of 3GPP. Figure 7.3-1: Scope of the Potential Consolidated Requirements
5eaf1b94c89939407995fc52470817be
22.890
7.2 Functional aspects
Table 7.2-1: Functional aspects consolidated requirements
CPR 7.2-1: The MCX service shall support obtaining and conveying MCX UE location information describing the positions of each MCX UE with different location accuracy simultaneously. (Original PR: PR 6.3.6-1. Comment: intended to be included in clauses 5.11 and 6.12 of TS 22.280.)
CPR 7.2-2: The FRMCS shall be able to provide a mechanism to trigger an emergency alert based on a combination of UE location (e.g. the location of the specific platform door / train door) and an application-generated trigger (e.g. train door did not close properly due to a blockage). (Original PR: PR-6.4.6-1. Comment: intended to be included in a new clause of TS 22.280.)
5eaf1b94c89939407995fc52470817be
22.890
7.3 Performance
Table 7.3-1: KPIs for Railway Smart Station Services
Scenario (Note 5): Multiple trains' stops at the same platform (Korea, urban railway)
End-to-end latency: ≤ 10 ms
Reliability (Note 1): 99.9999%
UE speed: ≤ 100 km/h
UE relative speed: ≤ 50 km/h
User experienced data rate: ≤ 1 Mb/s
Payload size (Note 2): small to large
Area traffic density: ≤ 1 Mb/s/km
Overall UE density: ≤ 5 (100 m)
Service area dimension (Note 3): ≤ 15 km along rail tracks, including bad weather conditions (Note 4)
NOTE 1: Reliability as defined in TS 22.289 sub-clause 3.1.
NOTE 2: Small: payload ≤ 256 octets; Medium: payload ≤ 512 octets; Large: payload 513-1500 octets.
NOTE 3: Estimates of maximum dimensions.
NOTE 4: Non-Line-of-Sight (NLOS) between UEs shall be supported.
NOTE 5: Off-network traffic characteristics are not addressed in this table since they can be covered by TR 22.990.
Note: This table is intended to be included in clause 6.2 of TS 22.289 [9].
5eaf1b94c89939407995fc52470817be
22.890
8 Conclusions and Recommendations
This technical report collects use cases and derives potential requirements related to RAILSS. This TR also clarifies whether the identified requirements are supported by the current 5G system or whether they are new potential requirements. The consolidated potential requirements that are related to KPIs will be considered for addition to TS 22.289 [9]. Most of the other requirements may target MCX specifications, such as TS 22.280 [7], TS 22.281 and TS 22.282 [8].
Annex A: Change history
09/2022, SA#97e, SP-220935: Raised to v.1.0.0 by MCC, solving missing Figure 6.5.1-1 (taken from S1-222357). New version: 1.0.0
09/2022, SA#97e: Raised to v.19.0.0 by MCC following SA one-step approval. New version: 19.0.0
3239937b55fdc406849fa93870986364
22.916
1 Scope
The present document describes use cases and aspects related to efficient communications service and cooperative operation for a group of service robots, including:
◦ exposure of information between the application layer and the communications layer;
◦ support of on-demand high priority communications;
◦ KPIs for large-scale group operation scenarios;
◦ support of scalable and efficient use of communication resources;
◦ requirements related to media applications specific to service robots; and
◦ aspects related to security, privacy and charging that are relevant to support stable operation of service robots.
This document also describes the existing service requirements and potential correlation with other studies.
3239937b55fdc406849fa93870986364
22.916
2 References
The following documents contain provisions which, through reference in this text, constitute provisions of the present document.
- References are either specific (identified by date of publication, edition number, version number, etc.) or non-specific.
- For a specific reference, subsequent revisions do not apply.
- For a non-specific reference, the latest version applies. In the case of a reference to a 3GPP document (including a GSM document), a non-specific reference implicitly refers to the latest version of that document in the same Release as the present document.
[1] 3GPP TR 21.905: "Vocabulary for 3GPP Specifications".
[2] 3GPP TS 22.104: "Service requirements for cyber-physical control applications in vertical domains; Stage 1".
[3] 3GPP TS 22.261: "Service requirements for the 5G system; Stage 1".
[4] 3GPP TS 22.263: "Service requirements for video, imaging and audio for professional applications (VIAPA); Stage 1".
[5] Next G Alliance Report: "6G Applications and Use Cases", May 2022; https://www.nextgalliance.org/wp-content/uploads/dlm_uploads/2022/07/NGA-Perspective-Brochure-V6.pdf
[6] K.-D. Lee, "A Smart Network of Service Robots: Technical Challenges and Design Considerations," IEEE Communications Magazine, pp. 28-34, August 2021.
[7] K.-D. Lee and C. Gray-Preston, "Everyday Living Assisted by 6G Applications and Solutions," IEEE Wireless Communications Magazine, October 2022.
[8] W. Zhuang, Y. Shen, L. Li, C. Gao and D. Dai, "Develop an Adaptive Real-Time Indoor Intrusion Detection System Based on Empirical Analysis of OFDM Subcarriers," Sensors (Basel, Switzerland), 21(7):2287, March 2021.
[9] IEEE 1872-2015: "IEEE Standard for Ontologies for Robotics and Automation".
[10] IEEE 2755-2017: "IEEE Guide for Terms and Concepts in Intelligent Process Automation".
[11] IEEE Standards Dictionary, http://dictionary.ieee.org
[12] NIST, https://www.nist.gov/system/files/documents/el/isd/ks/ALFUS-BG.pdf
[13] SAE AS4D Committee.
[14] NIST, "Autonomy Levels for Unmanned Systems (ALFUS) Framework"; https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=823618
[15] "Multimodal Fusion for Objective Assessment of Cognitive Workload: A Review," IEEE Transactions on Cybernetics, PP(99):1-14, September 2019, DOI:10.1109/TCYB.2019.2939399.
[16] "Data Fusion and IoT for Smart Ubiquitous Environments: A Survey," IEEE Access, PP(99):1-1, April 2017, DOI:10.1109/ACCESS.2017.2697839.
[17] MPEG, "Use cases and requirements for Video Coding for Machines"; https://www.mpeg.org/wp-content/uploads/mpeg_meetings/138_OnLine/w21545.zip
[18] 3GPP TR 22.886: "Study on enhancement of 3GPP support for 5G V2X services".
[19] 3GPP TR 26.926: "Traffic Models and Quality Evaluation Methods for Media and XR Services in 5G Systems".
[20] A.-L. Kim, et al., "Development of change detection algorithm using high resolution SAR complex image," in Proc. 2015 IEEE Asia-Pacific Conference on Synthetic Aperture Radar (APSAR). Online: https://ieeexplore.ieee.org/abstract/document/7306330
[21] G.W. Morgenthaler, et al., "Feasibility of Using a Miniature-UAV Remote-Sensing/Precision-Agriculture System (MINI-UAV RSPA SPACE SYSTEM) to Increase Crop Yields, Lower Costs, Save Water, and Reduce Pollution of Air, Soil, and Water," Aerospace Research Central, Nov. 2012. Online: https://arc.aiaa.org/doi/abs/10.2514/6.IAC-06-D3.2.09
[23] Juliet J. Lee, "Remote Sensing and Artificial Intelligence for Wildlife Conservation – A Survey," WPAC, 2023. Online: https://wvwpac.wixsite.com/westview/post/remote-sensing-and-artificial-intelligence-for-wildlife-preservation-a-survey
[24] E. Felemban, et al., "Underwater Sensor Network Applications: A Comprehensive Survey," International Journal of Distributed Sensor Networks, Volume 11, Issue 11, November 2015. Online: https://journals.sagepub.com/doi/epub/10.1155/2015/896832
[25] Ki-Dong Lee, "An efficient real-time method for improving intrinsic delay of capacity allocation in interactive GEO Satellite networks," IEEE Transactions on Vehicular Technology, vol. 53, no. 2, pp. 538-546, March 2004. Online: https://ieeexplore.ieee.org/document/1275718
[26] ITU-R, "Future technology trends towards 2030". Available online: https://www.itu.int/en/ITU-T/Workshops-and-Seminars/2023/0724/Pages/default.aspx
[27] Statista, https://www.statista.com/topics/1143/mining/
[28] 3GPP TR 22.847: "Study on supporting tactile and multi-modality communication services; Stage 1 (Release 18)", 2022-03.
[29] Next G Alliance (ATIS), "Network-Enabled Robotic and Autonomous Systems", May 2023. Available online: https://www.nextgalliance.org/white_papers/network-enabled-robotic-autonomous-systems/
[30] 3GPP TR 22.837: "Study on Integrated Sensing and Communication; Stage 1".
[31] 3GPP TR 22.856: "Study on Localized Mobile Metaverse Services; Stage 1".
3 Definitions of terms, symbols and abbreviations
3.1 Terms
For the purposes of the present document, the terms given in 3GPP TR 21.905 [1] and the following apply. A term defined in the present document takes precedence over the definition of the same term, if any, in 3GPP TR 21.905 [1].
NOTE:	Cited from IEEE 1872-2015 [9].
automated robot: A role for a robot performing a given task in which the robot acts as an automaton, not adapting to changes in the environment and/or following scripted plans.
fully autonomous robot: A role for a robot performing a given task in which the robot solves the task without human intervention while adapting to operational and environmental conditions.
orientation measure: Essentially a measure (Measure in SUMO) attributed to a (physical) object (Object in SUMO) concerning information regarding where the object is pointing to in relation to the reference object of the orientation coordinate system.
orientation region: Defines a region or interval orientation in relation to a reference object (Object in SUMO). For instance, the "south" interval of a compass constitutes an orientation region in the one-dimensional, circular coordinate system of the compass. Eventually, position regions and orientation regions are referred to by similar words. For instance, it is valid to say that a robot is at the north position, facing north. The former relates to a position region, i.e., the north region of a given country; the latter relates to an orientation region, i.e., the orientation interval around north on the compass.
orientation value: A value in a coordinate system denoting a specific orientation. Orientation values in one coordinate system can be mapped to other coordinate systems. An example of use of orientation value is in "the robot is oriented 54° in relation to the reference object."
remote-controlled robot: A role for a robot performing a given task in which the human operator controls the robot on a continuous basis, from a location off the robot, via only her/his direct observation. In this mode, the robot takes no initiative and relies on continuous or nearly continuous input from the human operator.
robot actuating part: A role for devices (Device in SUMO) that allow for the robot to move and act in the surrounding environment.
robot communicating part: A role for devices (Device in SUMO) that serves as instruments in a robot, robot communication process or a human-robot communication process by allowing the robot to send (or receive) information to (or from) a robot or a human.
robot group: A group (Group in SUMO) of robots organized to achieve at least one common goal.
robot processing part: A role played by processing devices which allows the robot to process information.
robot sensing part: A role played by any measuring device (MeasuringDevice in SUMO) that allows the robot to acquire information about its environment.
robot: An agentive device (Agent and Device in SUMO) in a broad sense, purposed to act in the physical world in order to accomplish one or more tasks. In some cases, the actions of a robot might be subordinated to actions of other agents (Agent in SUMO), such as software agents (bots) or humans. A robot is composed of suitable mechanical and electronic parts. Robots might form social groups, where they interact to achieve a common goal. A robot (or a group of robots) can form robotic systems together with special environments geared to facilitate their work.
semi-autonomous robot: A role for a robot performing a given task in which the robot and a human operator plan and conduct the task, requiring various levels of human interaction.
teleoperated robot: A role for a robot performing a given task in which a human operator, using sensory feedback, either directly controls the actuators or assigns incremental goals on a continuous basis, from a location off the robot. A teleoperated robot will complete its last command after the operator stops sending commands, even if that command is complex or time-consuming.
tandem sub-network: a group of robots (as UEs) that are serially connected to each other for a traffic session.
NOTE:	According to this definition, a tandem sub-network can have a ring. For example, if robot A is connected to robot C via an intermediate robot B for a traffic session, these three robots are said to have formed a tandem sub-network. If robot A has two different paths to robot C for a single traffic session, one via robot B and the other via robot B2, these four robots are also said to have formed a tandem sub-network.
Editor's Note:	The above definition is FFS.
3.2 Abbreviations
For the purposes of the present document, the abbreviations given in 3GPP TR 21.905 [1] and the following apply. An abbreviation defined in the present document takes precedence over the definition of the same abbreviation, if any, in 3GPP TR 21.905 [1].
ALFUS	Autonomy Levels for Uncrewed Systems
ASR	Automatic Speech Recognition
NLP	Natural Language Processing
NLU	Natural Language Understanding
ORA	Ontology for Robotics and Automation
R&A	Robotics and Automation
RCS	Rich Communication Services
SOBOT	Service Robot
SUMO	Suggested Upper Merged Ontology
TTS	Text To Speech
VoLTE	Voice over LTE
VoNR	Voice over NR
4 Overview
The present document addresses the existing and expected roles of communications in supporting operational models of service robots, focusing on two main collaboration modes within a group operation model. Maintaining high communication availability is crucial for optimal robot group performance, and timely sharing of event-related information is equally important; relevant studies and normative requirements are referenced throughout. The document aims to analyse and document 3GPP 5G system support for groups of service robots in various usage scenarios, considering spectrum usage and diverse requirements from the robotics industry.
Some of the features and aspects related to communication support for robot applications and group operations (such as those described in 3GPP TSs 22.186, 22.125, 22.261 and 22.263) are also listed as 'Related existing service requirements' in each use case. Additionally, some related studies and deployment scenarios are summarized in Clause 6.
In this overview, various applications of service robots are discussed across different scenarios:
-	Building 3D Maps in Unstructured Environments (5.1): Energy-efficient robots collaborate to create 3D maps in areas such as cleaning, disinfection, and agriculture. They adapt their actions to environmental conditions, emphasizing accuracy while optimizing computing and communication resources.
-	Enhancing Security Protection (5.2): Robots and security staff collaborate for patrolling, target identification, tracking, and alarm reporting in specific areas. Synchronized transmission ensures real-time response to security events, improving overall security.
-	Smart Cooperation for Data Integration (5.3): Robots collaborate to build an information set through data and sensor fusion, sharing real-time data and deciding on raw or pre-processed data transmission based on fusion levels. Applications explored include diverse underwater sensor networks.
-	Service Robots with Visual Sensors (5.4): Robots equipped with visual sensors focus on indoor video surveillance and intelligent transportation. They detect security events, recognize objects, and share processed data for enhanced situational awareness. Machine interpretation and optimized bandwidth usage are prioritized.
-	Service Robots in Continuing Care Retirement Communities (5.5): Robots assist in crime prevention, medical emergencies, natural language processing, and gesture recognition in care communities. They patrol, respond to emergencies, control smart home appliances, and deliver groceries, enhancing safety and support for residents.
-	Voicebots for Spoken Conversations (5.6): Voicebots aid individuals, especially the elderly, in accessing digital services and information. They operate through speech-to-text processing, audio sample transmission, and voice calls, ensuring real-time, natural conversation experiences.
-	Geo-surface Sensing and Multi-access Edge Computing (5.7): Geo-surface sensing applications generate vast amounts of data, requiring efficient preprocessing. Multi-access Edge Computing (MEC) enhances service robots' performance and efficiency, enabling innovative applications in navigation, object recognition, and human-robot interaction.
-	Robots in Mining (5.8): Robots perform diverse roles in the mining industry, including exploration, drilling, hauling, inspection, maintenance, and environmental monitoring. They operate in extreme conditions, enhancing safety and efficiency. Enhanced communications and sensing networks are crucial for underground mining operations.
Each scenario highlights the specific challenges faced by service robots and the technologies they rely on in different contexts, ranging from security and safety applications to data integration and mining operations.
5 Use cases
5.1 Online cooperative high-resolution 3D map building
5.1.1 General description
This use case considers a low-energy (or energy-efficient) cooperation scenario to collaboratively build a 3D map among a group of multiple robots, aiming at usage for unstructured settings, such as enterprise building cleaning, preparation for disinfection of large-scale buildings, and automation for agriculture. With cooperation among multiple robots gathering measurement data, it would be possible to either save energy or build a better quality outcome, or to attain both [5-7].
NOTE 1:	Some aspects related to "automation for agriculture" can also be studied with a combined scenario of ground mobility and aerial mobility.
NOTE 2:	The meaning of "map" in this use case is not necessarily limited to geographic appearance but it may also include still life objects that are useful or essential for robots working in an irregular and/or unstructured setting.
A group of service robots that are equipped with capabilities of multi-dimensional ambient sensing, computing (standalone and/or via compute fabric), federation in learning and model building, and 3GPP subscription-based communication, are in cooperation for a single joint project. The availability of communication service to/from the edge (or cloud) is threefold: not available, temporarily unavailable, or available (for a certain period of time; a positive interpretation, although the term "available" does not mean "permanently available").
NOTE 3:	This use case is mostly focused on ProSe-based operation (also referred to as "ProSe-enabled") with partial or intermittent connection to NG-RAN (or to an edge server via NG-RAN).
The edge (a server), if available for one or more of these service robots, will assist them to alleviate their computational burdens (that are or are not within the scope of 3GPP), giving rise to a demand for accessing service-specific network slice(s) or other forms of network resources with certain performance requirements.
An operator of robotic applications starts operating a group of service robots which are UEs. These service robots discover each other and share their capabilities.
NOTE 4:	For each service robot (UE), capabilities include certain characteristics such as types of supported RATs (e.g., NR, E-UTRA, or non-3GPP access technology) and information that is not within the scope of the communications layer, such as remaining battery life.
All or some of these service robots form a working group (with one or more leader robots) and start communicating. Member robots send measurement data to a leader robot so that the leader robot can perform the next step to build a 3D map.
NOTE 5:	The roles of leader robot(s) include coordination required for the operation of the working group of service robots, such as acting as sync master for other robots (sync devices) within the working clock domain.
These service robots scan environmental parameters, including 3GPP service availability, and collaboratively decide which operational scenario they should choose (i.e., Uu-based or ProSe-based, also referred to as ProSe-enabled). Each service robot in the working group moves in coordination with the others, forming a gregarious cluster (i.e., the distance between any pair is not so large that it degrades the quality of the map building outcome). Each service robot is exposed to uneven surfaces along its trajectory (e.g., signal angle measurement is not static; an unpredicted loss of measurement accuracy level is likely to happen).
Depending on the accuracy level of the 3D map at a certain spot of the job site and the decision made by the leader robot(s), the application layer of the leader robot requests to adjust the clock synchronisation target value within the clock synchronisation budget.
While moving along, one member robot, say robot A, faces some issue, resulting in an unexpected drop in its moving speed. Member robot A has already predicted this issue beforehand: its follow-up actions include reporting this information to a leader robot and marking a time stamp on the measurement data for this outlier situation. It is up to member robot A whether or not to send the measurement data with the outlier indication to a leader robot. It is up to the leader robot whether or not to use the received data with the outlier indication, if received from member robot A, for 3D map building. Later, member robot A moves a little away from the gregarious cluster, leading to a temporary loss of connection to a relay UE robot (or to the gNB in the Uu-based scenario). Member robot A promptly resumes the connection.
Fig. 5.1.1-1: Inter-robot operation example when a network of service robots that have ambient intelligence (e.g., intra-robot operation) are in cooperation for a joint project [5,7].
The working group of service robots can build up a 3D map with only the necessary level of accuracy, so that they do not have to consume computing and communication resources to build an overly accurate 3D map of an area. Also, for an important area, they can raise the level of accuracy. They can prevent potential noise factors that could have degraded the quality of the 3D map with the help of prediction-based indication. A robot that has instantaneously lost a connection can resume the connection very promptly and send time-critical information to other member(s).
5.1.2 Related existing service requirements
Clock synchronisation: 3GPP TS 22.104 [2]
-	clause 5.6.1 Clock synchronisation service level requirements
-	clause 5.6.2 Clock synchronisation service performance requirements
-	clause 7.2.3.2 Clock synchronisation requirements
Timing resiliency: 3GPP TS 22.261 [3]
-	clause 6.36.2 General requirements to ensure timing resiliency
-	clause 6.36.3 Monitoring and reporting
-	clause 6.36.4 Exposure
Multi-path relay: 3GPP TS 22.261 [3]
-	clause 6.9.2.1 support of a traffic flow of a remote UE via different indirect network connection paths
Positioning: 3GPP TS 22.261 [3]
-	clause 7.3.2 High accuracy positioning performance requirements (see also clause 5.7.1 of 3GPP TS 22.104 for the Factory of the Future scenario)
Service continuity: 3GPP TS 22.263 [4]
-	clause 5.5 Service continuity
5.1.3 Challenges and potential gaps
The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable.
[CPG-5.1.3-001] 5G system is expected to be able to provide a means to ensure a very high accuracy level of clock synchronization to support that a group of service robots can build up a 3D map collaboratively (i.e., synchronization among service robots within a collaborating group and synchronization among the multiple sources related to the respective service robots), in which the accuracy level is required by the applications layer.
NOTE 1:	Clock synchronization accuracy is provided by the 5G system in order to support applications that require time-sensitive communication. Depending on what is required by the robot application layer, a higher accuracy level is expected. In scenarios where a group of robots are connected to each other via multi-hop, the current accuracy level of clock synchronization (e.g., Cooperative carrying – fragile work pieces (A.2.2.5), Table 7.2.3.2-1, 3GPP TS 22.104 [2]) might not be sufficient. It is also discussed that a smaller clock synchronization budget might be necessary to support collaborative robot (cobot) scenarios where there are further network elements in the robots behind the UE.
[CPG-5.1.3-002] 5G system is expected to be able to ensure the integrity and validity of clock synchronization for a designated length of time when a group of service robots are in ProSe-based operation outside the coverage area served by NG-RAN.
NOTE 2:	The time length is dependent upon the type of project and is required by a service robot's application.
[CPG-5.1.3-003] If the integrity and validity of clock synchronization cannot be ensured for a designated length of time, the 5G system is expected to notify the application.
[CPG-5.1.3-004] 5G system is expected to be able to provide a means for UE(s) to adjust the accuracy level of clock synchronisation.
[CPG-5.1.3-005] 5G system is expected to be able to provide a means to share the accuracy level and integrity-related information of clock synchronization with the cloud (in the Uu-based scenario) or with the leader robot (in the ProSe-based scenario).
[CPG-5.1.3-006] 5G system is expected to be able to provide a means to resume the connection, when an ongoing connection is disrupted (e.g., due to radio link failure between a robot and the communicating counterpart), within a very short period of time required by the applications layer.
NOTE 3:	The current time period for securely reconnecting is required to be less than 1 s. Robotic applications that perform critical roles may require a much shorter time period: <100 ms for critical, <10 ms for highly critical.
NOTE 4:	The requirement is not intended to cover scenarios where the service robot experiences more severe levels of disruption, such as 3GPP registration state changes.
[CPG-5.1.3-007] 5G system is expected to be able to provide a means to allow a member robot that has predicted communication disruption or measurement failure to disseminate necessary information, which is required by the applications layer, to one or more destinations within a very short period of time required by the applications layer.
NOTE 5:	The required time period is application dependent (e.g. <100 ms for a moderate level of disruption, <10 ms for a critical level of disruption). The above CPG is intended for both RRC Connected mode and RRC Inactive mode. It is not intended that the above CPG is applicable to RRC Idle mode.
[CPG-5.1.3-008] Based on the request from the application (e.g., from an application of the leader robot in a robot group, or from an application in the cloud server), 5G system is expected to be able to determine a specific area in which the system could adjust the accuracy level of clock synchronization, and to be able to provide a means to expose the network capabilities (e.g. capability of monitoring "clock synchronization accuracy level") and the monitoring results (e.g. the accuracy level measured in the area of interest) to the application.
NOTE 6:	Possible scenarios regarding the referred robot group include a group of "automated robots", a group of "fully autonomous robots", a group of "teleoperated robots", and a group that consists of a suitable combination of those kinds of robots.
NOTE 7:	See Annex B for an example scenario for robot applications to adjust the accuracy level of clock synchronization provided by 5G. The accuracy level that the 5G system provides might differ depending on its availability, and the level that the application layer requires might differ depending on the time and environment in which the task is being performed.
NOTE 8:	In 3GPP TS 22.104 [2], Table 5.6.2-1 presents various scenarios that require different clock synchronization accuracy levels. The above CPG (i.e., [CPG-5.1.3-008]) is intended for a specific application or task of a robot (or a group of robots) that may need to adjust the clock synchronization accuracy level due to changes in the application or due to changes in the 5G system's capability.
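The following sketch (Python, illustrative only) outlines how a leader robot's application layer might use the capabilities described in CPG-5.1.3-003, CPG-5.1.3-004 and CPG-5.1.3-008: it reads exposed monitoring results for an area of interest and, when the measured accuracy is not sufficient for the current 3D-map task, requests a tighter clock synchronisation target. The exposure interface, its method names and the numeric values are assumptions made for illustration; they are not defined by 3GPP.

from dataclasses import dataclass


@dataclass
class SyncAccuracyReport:
    area_id: str
    achieved_accuracy_us: float   # accuracy measured in the area of interest
    integrity_valid: bool         # whether integrity/validity could be ensured


class HypotheticalExposureClient:
    """Stand-in for a network exposure interface; not an actual 3GPP-defined API."""

    def request_sync_accuracy(self, area_id: str, target_accuracy_us: float) -> bool:
        # Placeholder: a real client would signal the adjustment request to the network.
        return True

    def get_sync_accuracy_report(self, area_id: str) -> SyncAccuracyReport:
        # Placeholder: a real client would read exposed monitoring results.
        return SyncAccuracyReport(area_id, achieved_accuracy_us=900.0, integrity_valid=True)


def adjust_for_map_accuracy(client: HypotheticalExposureClient,
                            area_id: str,
                            required_accuracy_us: float) -> bool:
    """Tighten the sync target when the 3D-map task needs more accuracy than is measured."""
    report = client.get_sync_accuracy_report(area_id)
    if not report.integrity_valid:
        # Cf. CPG-5.1.3-003: the application is notified and can fall back, e.g. by
        # marking measurements as outliers instead of silently using bad timestamps.
        return False
    if report.achieved_accuracy_us > required_accuracy_us:
        return client.request_sync_accuracy(area_id, required_accuracy_us)
    return True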
5.2 Real-time cooperative safety protection
5.2.1 General description
This use case considers the collaboration between security staff and robots to complete security protection of a certain geographical area, including patrolling based on the configured route, target identification, target tracking, intelligent detection, alarm reporting, etc. The security protection task requires real-time information sharing among robots, security staff and the remote security controller. In addition, the decision or adjustment of security protection schemes, which may be made by a leader robot, a security person or a remote security controller, also needs to be received and executed by all the participants (e.g. robots or security staff) synchronously. Through the real-time collaboration among robots, security staff and the remote security controller, the performance and efficiency of security protection can be improved. Real-time cooperative safety protection can also reduce the labour intensity and work risks of security staff, as well as the cost (e.g. the number of security staff).
For example, one of the most important features of a smart factory is its safety production solution. Robots play an important role in a smart factory. A group of robots equipped with cameras and sensors are used to collect and report real-time information periodically according to the configured route. The security protection decision maker can be a leader robot, a security person or a remote security controller. Based on the latest global information, the decision maker determines whether there is a security event and how to respond. The potential events of security protection include intrusion detection, fall detection, smoke and flame identification, critical access occupancy identification, helmet identification, etc.
Fig. 5.2.1-1: Real-time cooperative safety protection
A group of robots equipped with cameras, sensors and 3GPP-based communication capabilities (e.g. direct network connection, indirect network connection or both) cooperatively work together to complete security protection of a certain geographical area. Depending on the complexity of the security protection task, the intelligence level of the robots and the quality of communication service, a security person equipped with a 3GPP-based UE or a remote security controller may be needed.
A security protection task is configured and started, which includes patrolling based on the configured route, target identification, target tracking, intelligent detection, alarm reporting, etc. A leader participant is chosen according to the complexity of the security protection task, the intelligence level of the robots, the quality of communication service, etc. The leader participant is in charge of collecting the latest information from the other participants and determining the control information for the other participants based on the latest global information. The leader participant can be a security person, a leader robot or a remote security controller. These three cases can be switched or used collaboratively. Taking a security person as the leader participant as an example, the service flows are described as follows.
1. Robots and the UE of security staff in the same security protection task discover each other and share their capabilities. The capabilities include both communication capabilities and service capabilities (e.g. sensor type, leader participant capability).
2. Based on the initial configuration, the leader participant (the UE of a security person) receives the latest information (e.g. location, target characteristic update, camera information, channel state information) from the other participants (e.g. robots, other security persons) periodically. The time period is about 1 ms to 100 ms depending on the security protection task. For example, the sampling rate of indoor intrusion detection is 50 Hz (20 ms) in [8]. When the camera information of each UE is a video stream, the data packets belonging to one frame or video slice [19] are relevant to each other: in some implementations all the data packets are needed, and if some data packets are missing the other data packets are useless. Therefore, to save resources, when some data packets of a frame or video slice cannot be transmitted on time, the other data packets from the same frame or video slice shall be discarded.
3. A single robot can only provide limited partial information, which is generally not enough for making a decision. One of the potential processing methods is shown in Fig. 5.2.1-2: after the leader participant receives all the participants' data that arrived in the first decision period (e.g. the blue block in the figure), the leader participant processes all the data in that period to generate the global information. When some data packets of a participant cannot be transmitted on time, the other data packets of the other participants with the same time stamp shall be discarded to save network resources. Based on the global information, the leader participant decides whether there is a security event (e.g. intrusion, fall, smoke, flame). The decision maker determines whether there is an intrusion every 100 ms in [8]. Ideally, the data packets of each participant with the same time stamp arrive at the leader participant at the same time. However, the channel status is different for all robots, which leads to different QoS for the data transmission (e.g. transmission delay, reliability). In some cases, the data packets of a participant arrive at the leader participant only in the last 20 ms of the decision period. Although the data packets of the other participants can arrive at the leader participant in the first 20 ms, the leader participant still needs to cache the received data (for at most 80 ms) to wait for the data from the last participant. Moreover, the decision cannot be processed until the last data arrives, which means higher computing capability is required. Considering the power consumption and size of a UE/robot, its computing and cache/storage capacity is limited. Synchronous transmission among the participants is required to save storage and simplify the processing logic of the UE/robot.
4. If the leader participant detects a security event based on the processing in step 3, the leader participant decides how to respond and sends control information to the other participants. Upon receipt of the control information, all the participants execute their own response tasks cooperatively: some participants may move to their target locations at the specified time, some participants will track a specific target, other participants may broadcast an alarm tone, etc.
5. Repeat steps 2-4 until the security protection task is completed.
The participants can use a direct network connection or an indirect network connection to communicate with each other, based on the quality of communication service of each link. When one of the robots is the leader participant, the performance requirement on synchronized transmission may be stricter: even though the leader robot usually has a higher computing capability and intelligence level, it still needs more time to process information, hence the time available for communication is shortened. When the remote security controller is the leader participant, the latency of wireless network transmission may be longer because of the longer distance.
Fig. 5.2.1-2: The schematic diagram of data transmission in a safety protection task
All the participants can share the latest information synchronously and execute real-time control cooperatively. The assigned security protection task can be completed efficiently.
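The following sketch (Python, illustrative only) captures the leader participant's per-decision-period handling described in steps 2 and 3 above: data that does not arrive within the synchronisation window is dropped, and sample sets that are incomplete for a given time stamp are discarded instead of being cached. The decision period, window length and data structures are assumptions for illustration, not normative values.

from collections import defaultdict

DECISION_PERIOD_MS = 100          # e.g. an intrusion decision every 100 ms [8]
SYNC_WINDOW_MS = 20               # assumed arrival window per sample time stamp


def group_complete_samples(arrivals, participants):
    """arrivals: list of (participant_id, timestamp_ms, arrival_ms, payload)."""
    by_timestamp = defaultdict(dict)
    for pid, ts, arrival, payload in arrivals:
        if arrival - ts <= SYNC_WINDOW_MS:           # arrived within the window
            by_timestamp[ts][pid] = payload
        # late data is simply not stored; cf. CPG-5.2.3-002
    complete = {}
    for ts, samples in by_timestamp.items():
        if set(samples) == set(participants):         # all participants present
            complete[ts] = samples
        # otherwise the partial set for this time stamp is discarded
    return complete


def leader_decision(arrivals, participants, detect_event):
    """Fuse only complete, time-aligned sample sets and decide whether to respond."""
    fused = group_complete_samples(arrivals, participants)
    return any(detect_event(samples) for samples in fused.values())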
5.2.2 Related existing service requirements
Clock synchronisation: 3GPP TS 22.104 [2]
•	clause 5.6.1 Clock synchronisation service level requirements
•	clause 5.6.2 Clock synchronisation service performance requirements
•	clause 7.2.3.2 Clock synchronisation requirements
Packet Delay Budget: 3GPP TS 23.501
•	clause 5.7.4 Standardized 5QI to QoS characteristics mapping
3GPP TS 22.261 [3] does not contain any requirements for synchronized transmission of multiple UEs.
5.2.3 Challenges and potential gaps
The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable.
[CPG-5.2.3-001] 5G system is expected to be able to provide a means to ensure synchronous arrival of associated data transmissions for a collaborating group of UEs within a defined accuracy level.
[CPG-5.2.3-002] 5G system is expected to be able to provide a means to support resource-efficient data transmission when some data from one UE in a group is not needed according to the synchronization need required by a trusted 3rd party for the collaborating group that the UE belongs to.
NOTE:	CPG-5.2.3-002 focuses on preventing further data being transmitted by group members after a trusted 3rd party has determined that the synchronization needs are not met for the time period.
5.3 Smart Communication Support for Data Collection and Fusion Using Multimodal Sensors in Multi-Robot / Multi-Agent Scenarios
5.3.1 General description
This use case considers a smart cooperation scenario for a group of robots to collaboratively build an information set (e.g., a dataset or knowledge base in AI/ML) through data/sensor fusion [15] when the fusion of data from multimodal sensors is conducted by a group of robots.
NOTE:	The term "fusion" in "data fusion", "sensor fusion" and so on is also used interchangeably with "integration". The term "smart" is intended to suggest a concept of consuming low-energy, energy-efficient, resource-efficient and/or situation-aware means of communication to support an intended fusion task for the group of robots.
The use of "levels of fusion" is expected to help advance the United Nations Sustainable Development Goals (SDGs) in several aspects. Provided that the 5G advanced technology enablers are designed in resource-efficient ways for various types of resources (e.g., radio resources, network resources, material such as battery related), such considerations can also help provide affordable 6G services in society, especially when certain groups of residents, patients, public-safety officers, or underrepresented groups need the communication services the most at a critical point in time in their everyday living. Refer to Annex <A> for some examples that have already been identified.
There are various scenarios of multi-robot / multi-agent group operations in which a robot should be able to identify certain information (e.g., detecting an object, detecting multiple objects at the same time) or collect data that should be shared with other robots in that group in real time. Fig. 5.3.1-1 shows two examples. When a robot in a group begins to collect certain data (or information), the robot should determine what it should do with the data, such as whether to share the data without any pre-processing inside the robot (i.e., an applications layer role utilizing some input coming from the communications layer), or to perform a certain level of pre-processing before sharing the processed form of data with the other participants (or participating robots) in the group for a certain task.
Fig. 5.3.1-1: Examples of using sensor data where objects are in different dimension/size and/or in different ranges. (a) Two distinct objects (A and B) of the same size at different ranges. (b) Two distinct objects (A and C) of different sizes at the same range (approximately).
Fig. 5.3.1-2 shows an example where both communication needs (i.e., sensor data that are the outcome of one of multiple levels of the fusion process inside the originating robot) and communication opportunities (i.e., how much communication resource is likely to be available for a robot when there are multiple robots in place) are fluctuating, leading to a complex scheduling load on 5G systems, such as at a RAN node or a CN node. In order for 5G systems to be able to efficiently and reliably support the dynamic need for transmission opportunities, it is necessary to ensure that robots (as UEs) are provided with a suitable means to share their intents (e.g., levels of fusion, desired amount of traffic to transmit at a certain point in time).
Fig. 5.3.1-2: Example of different levels of communications opportunity need (or transmission opportunity need) under a combination of normal and challenging (or extreme) communication conditions. Both communication needs (e.g., traffic volume and communication link availability) can be different and dynamically fluctuating subject to changes in a given environment.
Figure 5.3.1-3a presents some examples of underwater sensor network applications and their classifications, as presented by Felemban et al. [24]. Figure 5.3.1-3b presents a common architecture for an underwater sensor network (UWSN). The authors presented several categories of applications, such as monitoring applications, disaster applications, military and assisted navigation applications, based on a few underwater models:
•	1D-UWSN Architecture. One-dimensional (1D) UWSN architecture refers to a network where the sensor nodes are deployed autonomously.
•	2D-UWSN Architecture. Two-dimensional (2D) UWSN architecture refers to a network where a group of sensor nodes (a cluster) are deployed underwater.
•	3D-UWSN Architecture. In this type of network, the sensors are deployed underwater in the form of clusters and are anchored at different depths.
•	4D-UWSN Architecture. Four-dimensional (4D) UWSN is designed by the combination of a fixed UWSN, that is, 3D-UWSN, and mobile UWSNs.
Fig. 5.3.1-3a: Examples of underwater sensor network applications [24].
Fig. 5.3.1-3b: Underwater sensor network architecture [24].
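The following sketch (Python, illustrative only) shows one way a robot's application layer could pick a level of data/sensor fusion that fits the transmission opportunity it expects (cf. Fig. 5.3.1-2) and then declare its traffic intent towards the network. The three-level fusion model, the intent fields and the selection rule are assumptions for illustration only.

RAW, FEATURE_LEVEL, DECISION_LEVEL = 0, 1, 2     # assumed three-level fusion model


def choose_fusion_level(raw_bytes, feature_bytes, decision_bytes,
                        available_bytes_per_period):
    """Pick the least-processed representation that still fits the opportunity."""
    if raw_bytes <= available_bytes_per_period:
        return RAW, raw_bytes
    if feature_bytes <= available_bytes_per_period:
        return FEATURE_LEVEL, feature_bytes
    return DECISION_LEVEL, decision_bytes


def build_traffic_intent(robot_id, fusion_level, volume_bytes, period_ms):
    """Intent the robot could share so the network can schedule transmissions efficiently."""
    return {
        "robot_id": robot_id,
        "fusion_level": fusion_level,
        "bytes_per_period": volume_bytes,
        "period_ms": period_ms,
    }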
5.3.2 Related existing service requirements
Clock synchronisation: 3GPP TS 22.104 [2]
•	clause 5.6.1 Clock synchronisation service level requirements
•	clause 5.6.2 Clock synchronisation service performance requirements
•	clause 7.2.3.2 Clock synchronisation requirements
NOTE 1:	The types of sensor data and media that robots are collecting, pre-processing and sharing with each other and/or with the edge cloud (or edge server, cloud server) are related to the need to fulfil the above sets of requirements. Clock synchronization requirements are mostly related to ProSe communication scenarios.
Timing resiliency: 3GPP TS 22.261 [3]
•	clause 6.36.2 General requirements to ensure timing resiliency
•	clause 6.36.3 Monitoring and reporting
•	clause 6.36.4 Exposure
NOTE 2:	Timing resiliency is considered as a set of preconditions that ensure "clock synchronization", especially when robots (as UEs) or the leader robot(s) (as opposed to "robot followers") are served by at least one PLMN.
Multi-path relay: 3GPP TS 22.261 [3]
•	clause 6.9.2.1 support of a traffic flow of a remote UE via different indirect network connection paths
Service continuity: 3GPP TS 22.263 [4]
•	clause 5.5 Service continuity
NOTE 3:	Service continuity is not necessarily related to all types of sensor data and media.
The following aspects are considered as potentially covered by the existing service requirements (refer to Table 5.3.2-1). It is FFS whether there exist some gaps that are not identified.
Table 5.3.2-1: Support of communications layer adaptive to the use of different levels of fusion and to dynamic changes of communication resource availability
(Columns: Approach by "Levels of Fusion" | Existing features / requirements | Remarks)
Three-level approach:
[CPG-5.3.2-002a] [Underground model] 5G system is expected to be able to support up to [TBC] robots with a deployment and operation model with a range of less than [10 m] (depth) x [50 m] (radius, horizontal range). (NOTE 1)
[CPG-5.3.2-002b] [Near-ground surface model 1] 5G system is expected to be able to support up to [TBC] robots with a deployment and operation model with a range of less than [10 m] (height) x [500 m] (radius, horizontal range). (NOTE 2)
[CPG-5.3.2-002c] [Near-ground surface model 2] 5G system is expected to be able to support up to [TBC] robots with a deployment and operation model with a range of less than [10 m] (height) x [50 m] (radius, horizontal range). (NOTE 3)
[CPG-5.3.2-002d] [Underwater model] 5G system is expected to be able to support up to [TBC] robots with a deployment and operation model with a range of less than [50 m] (depth) x [1000 m] (radius, horizontal range on water surface). (NOTE 4)
NOTE 1:	Radio propagation characteristics can affect the performance but are not in the scope of a stage-1 study.
NOTE 2:	Model 1 is related to initial search in, e.g., urban search and rescue scenarios.
NOTE 3:	Model 2 is related to intensive search in, e.g., urban search and rescue scenarios.
NOTE 4:	The path loss exponent for underwater (line of sight) ranges from 2 to 4. However, compared to fresh water conditions, seawater is known to have a higher value of water conductivity (greater than 2) and a higher absorption loss of electromagnetic waves.
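To give a rough feel for NOTE 4 above, the following sketch (Python, illustrative only) evaluates a textbook log-distance path-loss model for exponents between 2 and 4 at the horizontal ranges used by the deployment models in Table 5.3.2-1. The reference path loss at 1 m is an assumption; this is not a normative 3GPP channel model.

import math


def log_distance_path_loss_db(distance_m, exponent, pl_at_1m_db=40.0):
    """PL(d) = PL(d0) + 10*n*log10(d/d0), with d0 = 1 m assumed."""
    return pl_at_1m_db + 10.0 * exponent * math.log10(distance_m)


if __name__ == "__main__":
    for n in (2.0, 3.0, 4.0):                      # exponent range quoted in NOTE 4
        for d in (50, 500, 1000):                  # ranges used by the models above
            print(f"n={n:.0f}, d={d:4d} m -> {log_distance_path_loss_db(d, n):6.1f} dB")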
5.3.3 Challenges and potential gaps
The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable. The following aspects in Table 5.3.3-1 are expected to be supported.
Table 5.3.3-1: Support of communications layer adaptive to the use of different levels of fusion and to dynamic changes of communication resource availability
(Columns: Approach by "Levels of Fusion" | Challenges and potential gaps | Remarks)
Three-level approach:
[CPG-5.3.3-001] 5G system is expected to be able to provide a suitable means with a very high efficiency and reliability to accommodate the dynamic changes at a robot's application layer related to traffic demand (e.g., caused by using different levels of data/sensor fusion within a robot). (NOTE 1)
Editor's Note:	In the above CPG, the degrees of efficiency and reliability are FFS.
[CPG-5.3.3-002] The 5G system is expected to provide a suitable means for the application layer of robots, based on its availability and capability, to allocate network resources (e.g., network slice) specifically for robot applications. (NOTE 2)
NOTE 2:	This allows for suitable adaptations in the communication layer, addressing dynamic changes occurring in a robot's application layer.
Six-level approach: FFS (NOTE 3)
NOTE 1:	Examples include a suitable API that can support many intra-robot sessions between a robot's applications layer (e.g., "robot sensing part", "robot processing part", "robot actuating part") and communications layer (e.g., "robot communicating part").
NOTE 3:	The six-level approach is considered more complex to apply, compared to the three-level approach, due to the increased quantity/dimension of computation and communication needs.
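Expanding on NOTE 1, the following sketch (Python, illustrative only) shows a hypothetical intra-robot interface with which the "robot sensing part", "robot processing part" and "robot actuating part" open and update sessions towards the "robot communicating part", so that the communications layer can expose an aggregated demand (e.g. when network resources such as a slice are requested). All names and fields are assumptions; no such API is defined by 3GPP.

class RobotCommunicatingPart:
    """Hypothetical intra-robot session broker; illustrative only."""

    def __init__(self):
        self._sessions = {}

    def open_session(self, part_name, qos_hint):
        """qos_hint, e.g. {'bytes_per_s': 2_000_000, 'latency_ms': 20}."""
        session_id = len(self._sessions) + 1
        self._sessions[session_id] = (part_name, dict(qos_hint))
        return session_id

    def update_session(self, session_id, qos_hint):
        # Called when the application layer changes its level of fusion.
        part_name, _ = self._sessions[session_id]
        self._sessions[session_id] = (part_name, dict(qos_hint))

    def aggregate_demand(self):
        """Total throughput demand the robot could expose towards the network."""
        return sum(hint.get("bytes_per_s", 0) for _, hint in self._sessions.values())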
5.4 Media-related use cases
5.4.1 General description
5.4.1.1 Video surveillance
The use cases focus on media aspects of service robots composed of visual sensors such as cameras and equipped with capabilities to analyse video signals (e.g. feature extraction, object tracking and detection) and able to process this information (i.e., take consequent action). These use cases are directly related to the standardization effort in MPEG for defining a media compression format called Video Coding for Machines (VCM). Associated MPEG use cases and requirements are available in [17].
This use case addresses scenarios in which automatic object detection and tracking is achieved with video cameras. One illustration is the monitoring of indoor environments in which features of the captured video need to be extracted, such as intrusion detection, fire detection, and trusted individual recognition. The basic service configuration is depicted in Figure 5.4.1.1-1 below.
Figure 5.4.1.1-1: Use case on video surveillance
In this scenario, video cameras capture a video stream together with related features extracted from pre-processing, such as object detection and tracking, which are sent to the back-end server for analysis and processing. The uplink bandwidth is expected to be used in an optimized way in a configuration in which there is no need for human understandability of the signal. The image quality requirements are thus limited to machine interpretation.
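The following sketch (Python, illustrative only) expresses the uplink-saving idea of this use case: when only machine interpretation is needed, the camera sends compact feature descriptors rather than a human-viewable video stream. The descriptor format and the extraction placeholder are assumptions for illustration; they are not the MPEG VCM format referenced in [17].

def extract_descriptors(frame_pixels):
    """Placeholder pre-processing on the camera: returns a small list of detections."""
    # A real robot would run detection/tracking here; we just emit a dummy bounding box.
    return [{"label": "person", "box": (120, 40, 60, 160), "score": 0.93}]


def uplink_payload(frame_pixels, machine_only=True):
    """Choose what to send: descriptors only, or descriptors plus the full frame."""
    payload = {"descriptors": extract_descriptors(frame_pixels)}
    if not machine_only:
        payload["frame"] = frame_pixels      # only when human viewing is required
    return payload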
5.4.1.2 Intelligent transportation
In a smart traffic system, cars may need to communicate features between each other and other sensors in order to perform different tasks. Sensors in the infrastructure may communicate features towards different vehicles, which then use these features to do object detection, lane tracking, etc. Final processing of these features is done on the individual vehicles. An example is illustrated in Figure 5.4.1.2-1. The front car, with multiple cameras, sees the surrounding environment and detects and recognizes objects such as cars, pedestrians, or street furniture, or even recognizes events such as traffic jams or accidents, potentially using (deep) neural networks. The processed data (feature maps) may be consumed internally for the desired tasks, and/or the extracted features may be compressed and transmitted to other surrounding cars/infrastructure (e.g., roadside cell/grid) for further analysis. Sending a standardized compact bitstream is essential for interoperability between various vendors, IoV (Internet of Vehicles), and IoT (Internet of Things) applications.
Figure 5.4.1.2-1: Use case on V2X communications
The bandwidth is expected to be used in an optimized way in a configuration in which there is no need for human understandability of the signal. The image quality requirements are thus limited to machine interpretation.
5.4.2 Related existing service requirements
There is no service requirement known that relates to the bitrate efficiency of robotics video signals; however, the Release 16 study on enhancement of 3GPP support for 5G V2X services in TR 22.886 [18] describes the KPIs for video data sharing, particularly for machine-centric video data analysis. The following parameters are used as related KPIs:
-	Latency: less than 10 ms;
-	Data rate: 100 – 700 Mbps; and
-	Reliability: 99.99%.
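As a simple illustration of what the quoted KPIs imply, the short calculation below (Python) shows how much data can be moved within the 10 ms latency budget at the quoted data rates, which bounds the size of a per-update feature payload: roughly 0.125 Mbyte at 100 Mbps and 0.875 Mbyte at 700 Mbps.

# Data volume that fits within the 10 ms latency budget at the KPI data rates.
for rate_mbps in (100, 700):
    bytes_per_budget = rate_mbps * 1e6 / 8 * 0.010   # 10 ms latency budget
    print(f"{rate_mbps} Mbit/s -> {bytes_per_budget / 1e6:.3f} Mbyte per 10 ms")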
5.4.3 Challenges and potential gaps
The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable.
Functionality aspects:
[CPG-5.4.3-001] 5G system is expected to convey media signals for machine-type communications that support feature extraction and descriptor signalling.
NOTE:	The type of feature and descriptor is out of scope of this document.
Efficiency aspects:
[CPG-5.4.3-002] 5G system is expected to meet the service requirements to enable communication between robots using media formats offering better compression efficiency than the 3GPP codecs already specified for human consumption.
5.5 Smart community
5.5.1 General description
This use case considers service robots in a Continuing Care Retirement Community (CCRC). A CCRC is a community that provides continuing care for the elderly. Several types of assistance and care are required in the community. The first is independent living, in which residents can take care of themselves and require limited assistance. The second is assisted living, in which residents need help as needed with daily tasks such as bathing and dressing. The third is 24-hour nursing home care, which usually takes place in a dedicated skilled nursing facility. Service robots within the community can help to maintain and improve the liveability of the community. A CCRC requires a lot of labour to take care of the residents; service robots can therefore help by relieving human staff of part of this work. Multiple types of robots serve in the community:
1) Patrol robots for crime prevention: Service robots can be equipped with cameras or sensing entities that record the surroundings or provide a sensing service. The service robots can record videos or take photos of the surroundings and then send the video or photo to a server for processing to detect potential crime. The service robots might not have the computation resources to process the gathered information by themselves; in order to act rapidly on a potential crime, the video or photo can be uploaded to the local edge server for processing. Patrol robots can also help with community security and personal security, such as detecting a fallen elderly person around the community for a quick rescue. There can be uncrewed aerial vehicle patrol robots and patrol robots on the ground, as shown in the following figure. Cellular coverage usually does not cover the whole community. Relays can be placed within the community to extend the cellular coverage and ensure the communication of the patrol robots, for example by placing them with the surveillance cameras, which are assumed to be well located within the community.
Fig. 5.5.1-1: Patrol robots in the CCRC.
Daily patrolling:
1. When the patrol robot is in weak cellular coverage, it automatically switches to an indirect network connection. It can connect to the base station via the stationary relays that are placed with the surveillance cameras. When it detects that the connection to the base station is good enough, it switches back to a direct network connection. When the patrol robots send the surveillance video to the control centre, they can establish multiple indirect network connection paths.
2. The patrol robot takes videos or photos when it detects a security event from its sensing unit and sends them to the control centre.
3. During daily patrolling, the patrol robots might detect a risk to community safety (e.g. a thief); the patrol robots will then track the thief and report the real-time location to the security department. Based on the reports from the patrol robots, the security department might provide tracking assistance information and tracking instructions to the patrol robots from its surveillance system. The patrol robots move around the community, and the connection paths might change during the movement of the patrol robots. Service continuity is required for the robots to get clear instructions for tracking.
Medical assistance event:
Fig. 5.5.1-2: Medical assistance event with patrol robots.
1. When an elderly person feels unwell at home, he/she can ask for medical assistance from the control centre of the CCRC by pressing a button in his/her home. The control centre dispatches the nearest patrol robot to measure and monitor the vital signs of the elderly person, such as heart rate, blood pressure and blood oxygen, by connecting with the sensing devices.
2. In order to dispatch the nearest patrol robot, the control centre has to obtain the location information of the patrol robots, and the patrol robots should be equipped with dynamic path planning to find the nearest path. Communication between the control centre and the patrol robots can use a direct network connection or an indirect network connection.
3. The patrol robot gets a temporary authorization to connect to the elderly person's personal smart devices to monitor his/her health condition.
4. When the patrol robot arrives, it connects with the smart watch that the elderly person is wearing, or with smart devices at home with sensing capabilities, which can monitor the vital signs. The vital signs of the elderly person can be transmitted to the first-aider via the smart watch and the patrol robot. The patrol robot is also equipped with a camera that can capture live video of the elderly person for health monitoring. The live video from the camera and the vital sign transmission should be synchronised.
5. Once the first-aider arrives, the first-aider can start the medical treatment immediately.
2) Natural language or gesture recognition by the service robots: Residents can use service robots with natural language processing capability to operate smart home appliances, e.g. smart refrigerators, smart speakers, and smart washing machines. Additionally, the gesture recognition capability of the service robots can also help with the remote control of smart home appliances. Natural language and gesture recognition require processing capabilities, and the voice or video traffic can be offloaded to the edge server for processing, meeting a lower latency requirement compared with communication via cloud computing.
3) Smart transportation for delivery robots: Service robots can help to deliver groceries or parcels to customers. Smart delivery route planning can be based on the local map of the community. The local map might include privacy-sensitive information of the residents, which can be stored at the local edge server for privacy reasons.
NOTE:	Depending on regional or national laws, a delivery robot is considered in different ways: (a) if delivery robots are considered as vehicles (e.g. in South Korea), they have to follow the rules that a regular road vehicle does, and some or all of the V2X-related aspects are suitable for the service scenarios; (b) if they are considered as vulnerable road users under certain conditions (e.g., speed limit, weight limit), they are allowed to operate on sidewalks (e.g., in the State of Pennsylvania).
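The following sketch (Python, illustrative only) reflects the connection-mode behaviour described in step 1 of the daily patrolling above: the patrol robot falls back to an indirect network connection via a stationary relay when the direct link is weak, and returns to a direct network connection once the link is good enough again. The thresholds and the hysteresis gap are assumptions for illustration, not normative values.

DIRECT, INDIRECT = "direct", "indirect"

SWITCH_TO_INDIRECT_DBM = -110.0    # assumed "weak coverage" threshold
SWITCH_TO_DIRECT_DBM = -100.0      # assumed "good enough" threshold (hysteresis gap)


def next_connection_mode(current_mode, direct_link_rsrp_dbm):
    """Return which connectivity model the patrol robot should use next."""
    if current_mode == DIRECT and direct_link_rsrp_dbm < SWITCH_TO_INDIRECT_DBM:
        return INDIRECT                      # fall back to the relay path
    if current_mode == INDIRECT and direct_link_rsrp_dbm > SWITCH_TO_DIRECT_DBM:
        return DIRECT                        # base-station link is good again
    return current_mode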
5.5.2 Related existing service requirements
Indirect network connection: 3GPP TS 22.261 [3]
•	clause 6.9 Connectivity models requirements
Positioning: 3GPP TS 22.261 [3]
•	clause 6.27.2 Positioning services requirements
Management of a PIN: 3GPP TS 22.261 [3]
•	clause 6.38.2.2 Authorizing/deauthorizing PIN Elements with Management Capability
Efficient user plane: 3GPP TS 22.261 [3]
•	clause 6.5 Efficient user plane
V2X aspects: 3GPP TS 22.186
•	clause 5.3 Advanced Driving
•	clause 5.5 Remote Driving
NOTE:	V2X aspects are suitable for service scenarios in countries or regions where a delivery robot is considered a vehicle (e.g., in South Korea).
5.5.3 Challenges and potential gaps
The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable. [CPG-5.5.3-001] 5G system is expected to maintain service continuity of an indirect network connection for a remote UE when the communication path to the network changes while the remote UE is using multiple indirect network connection paths for a single traffic flow.