Detect network connectivity issues using RTCPeerConnection.getStats()

I’m using stream.pc.peerConnection.getStats() on the client side to fetch stats and detect user connectivity issues. However, when I impose packet loss on the publisher side, I see an increase in packet loss, as well as in the NACK and PLI counts, on the subscriber side. Is this normal? If so, how would it be possible to distinguish between publisher and subscriber connection issues using the stats report?
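For reference, here is a simplified sketch of how I aggregate the relevant stats; `stream.pc.peerConnection` is my app's wrapper around a standard `RTCPeerConnection`, and the helper below just filters the report by stats type (field names per the standard WebRTC stats identifiers):

```javascript
// Sum packet loss / NACK / PLI from inbound-rtp entries and grab the
// current RTT from the active ICE candidate pair.
function summarizeStats(statsEntries) {
  const summary = { packetsLost: 0, nackCount: 0, pliCount: 0, rttSeconds: null };
  for (const s of statsEntries) {
    if (s.type === 'inbound-rtp') {
      // Counters reported by the receiving side for this RTP stream.
      summary.packetsLost += s.packetsLost ?? 0;
      summary.nackCount += s.nackCount ?? 0;
      summary.pliCount += s.pliCount ?? 0;
    } else if (s.type === 'candidate-pair' && s.state === 'succeeded' &&
               s.currentRoundTripTime !== undefined) {
      // RTT (in seconds) measured on the selected candidate pair.
      summary.rttSeconds = s.currentRoundTripTime;
    }
  }
  return summary;
}

// In the browser:
// const report = await stream.pc.peerConnection.getStats();
// const summary = summarizeStats(report.values());
```

I poll this periodically and watch the deltas between samples, which is where I noticed the subscriber-side counters rising during publisher-side packet loss.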

The reverse test behaves as expected: when I simulate packet loss on the subscriber side, it doesn’t affect the publisher’s stats report.

Worse, the same thing happens with RTT (round-trip time) in both scenarios: latency on either peer, publisher or subscriber, is simply reported (forwarded) to the other peers. This makes getStats() effectively useless for pinpointing which peer has the connectivity problem.