With collaborative sessions in ARKit 3, mapping information is continuously shared between multiple devices over the network. This enables ad-hoc multi-user experiences and lets ARAnchors be shared across all devices. Each anchor is identifiable by its session ID on every device. At this point, the coordinate systems remain independent of each other, even though the mapping information is shared under the hood.
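A minimal sketch of how this sharing is wired up: collaboration is opted into on the configuration, ARKit then periodically emits collaboration data via the session delegate, and each device feeds the data it receives from peers back into its own session. The `sendToPeers(_:)` helper is a placeholder I've assumed here; in a real app you would transmit the bytes over MultipeerConnectivity or any other transport.

```swift
import ARKit

class CollaborativeSessionManager: NSObject, ARSessionDelegate {
    let session = ARSession()

    func startCollaboration() {
        let configuration = ARWorldTrackingConfiguration()
        // Opt in to collaborative mapping (ARKit 3+).
        configuration.isCollaborationEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    // ARKit periodically produces collaboration data to send to peers.
    func session(_ session: ARSession,
                 didOutputCollaborationData data: ARSession.CollaborationData) {
        guard let encoded = try? NSKeyedArchiver.archivedData(
            withRootObject: data, requiringSecureCoding: true) else { return }
        sendToPeers(encoded)
    }

    // When data arrives from another device, feed it back into the session.
    func receivedFromPeer(_ encoded: Data) {
        if let data = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: ARSession.CollaborationData.self, from: encoded) {
            session.update(with: data)
        }
    }

    // Hypothetical transport hook — replace with MultipeerConnectivity etc.
    func sendToPeers(_ data: Data) { /* network transport omitted */ }
}
```

This round-trip is what keeps the under-the-hood maps in sync while each device still runs in its own coordinate system.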

In Figure 1, two users gather and share feature points in world space. The two maps merge into one. In addition, each user appears to the others as an ARParticipantAnchor, which lets you detect when another user is in your environment. Sessions are not limited to two users; a large number of users can join a single session.
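Detecting those other participants is just a matter of watching for ARParticipantAnchor in the anchor callbacks. A small sketch:

```swift
import ARKit

// Minimal sketch: reacting when another user's device is localized in our map.
class ParticipantObserver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            if let participant = anchor as? ARParticipantAnchor {
                // Another participant has joined; their device pose is
                // available in our world coordinates via the transform.
                print("Participant joined at \(participant.transform)")
            }
        }
    }
}
```

Once the maps have merged, the participant anchor's transform gives you the peer device's position and orientation, which is what makes shared, spatially aware experiences possible.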
Can you imagine what would be possible with this technology and 5G?
I think Apple could use a technology similar to the iOS 13 Find My network to combine the discovered feature points of nearby devices. It would even allow Apple to create an AR Cloud.
That's amazing.

The 2020 iPhones come with 5G and a rear-facing laser scanner. Combining these technologies would let Apple capture the 3D characteristics of the world in a cloud system. It would even allow Apple to create a Google Earth equivalent for the insides of buildings.
I believe that by 2022 we will see a drastic paradigm shift in how computing and user interaction work.

Sources:
1. https://multitudes.github.io/2019/07/Introducing-ARKit3.html