Semantic Multi-Resolution Communications

by Matin Mortaheb, et al.

Deep learning-based joint source-channel coding (JSCC) has achieved significant gains in data reconstruction over separate source-channel coding (SSCC). This superiority stems from the suboptimality of SSCC in the finite block-length regime. Moreover, SSCC falls short when data must be reconstructed in a multi-user and/or multi-resolution fashion, since it can only target the worst channel and/or the highest-quality data. To overcome these limitations, we propose a novel deep learning multi-resolution JSCC framework inspired by the concept of multi-task learning (MTL). The proposed framework encodes data for different resolutions through hierarchical layers and decodes it effectively by leveraging both the current and past layers of encoded data. The framework also holds great potential for semantic communication, where the objective extends beyond data reconstruction to preserving specific semantic attributes throughout the communication process. These semantic features could be crucial elements such as class labels, essential for classification tasks, or other key attributes that require preservation. Within this framework, each level of encoded data can be carefully designed to retain specific data semantics. As a result, the precision of a semantic classifier can be progressively enhanced across successive layers, emphasizing the preservation of targeted semantics throughout the encoding and decoding stages. We conduct experiments on the MNIST and CIFAR-10 datasets. On both datasets, the proposed method surpasses the SSCC method in reconstructing data at different resolutions, and it extracts semantic features with increasing confidence in successive layers. This capability is particularly advantageous for prioritizing and preserving the most crucial semantic features within the datasets.
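To make the hierarchical encode/decode idea concrete, the following is a minimal illustrative sketch, not the authors' implementation: random linear maps stand in for the trained encoder/decoder networks, the dimensions and the AWGN channel model are assumptions, and each resolution-level decoder consumes the concatenation of its own layer together with all earlier layers, mirroring the "current and past layers" structure described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a 64-dim source and 3 resolution layers of 16 channel symbols each.
SRC_DIM, LAYER_DIM, NUM_LAYERS = 64, 16, 3

# Random linear "encoders", one per resolution layer (stand-ins for trained networks).
enc_weights = [
    rng.normal(0, 1 / np.sqrt(SRC_DIM), (LAYER_DIM, SRC_DIM))
    for _ in range(NUM_LAYERS)
]
# Decoder at level k sees the concatenation of layers 0..k (current + past layers).
dec_weights = [
    rng.normal(0, 1 / np.sqrt(LAYER_DIM * (k + 1)), (SRC_DIM, LAYER_DIM * (k + 1)))
    for k in range(NUM_LAYERS)
]

def awgn(z, snr_db):
    """Pass symbols through an additive white Gaussian noise channel at the given SNR (dB)."""
    power = np.mean(z ** 2)
    noise_var = power / (10 ** (snr_db / 10))
    return z + rng.normal(0, np.sqrt(noise_var), z.shape)

def transmit(x, snr_db=10.0):
    """Encode x into hierarchical layers, send each layer over the channel,
    then produce one reconstruction per resolution level from current + past layers."""
    received = [awgn(W @ x, snr_db) for W in enc_weights]
    outputs = []
    for k, Wd in enumerate(dec_weights):
        stacked = np.concatenate(received[: k + 1])  # layers 0..k
        outputs.append(Wd @ stacked)
    return outputs

x = rng.normal(size=SRC_DIM)
recons = transmit(x)
print([r.shape for r in recons])  # → [(64,), (64,), (64,)]
```

In the actual framework the encoders and decoders would be trained jointly (e.g. under an MTL objective so that each layer also preserves targeted semantics such as class labels); the sketch only shows the layered data flow.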



