no code implementations • 2 May 2024 • Matin Mortaheb, Erciyes Karakaya, Mohammad A. Amir Khojastepour, Sennur Ulukus
The transformer structure employed in large language models (LLMs), as a specialized category of deep neural networks (DNNs) featuring attention mechanisms, stands out for its ability to identify and highlight the most relevant aspects of input data.
no code implementations • 30 Apr 2024 • Purbesh Mitra, Sennur Ulukus
In this work, we investigate the staleness criteria for such a system, which is a sufficient condition for convergence of individual user models.
no code implementations • 3 Jan 2024 • Yalin E. Sagduyu, Tugba Erpek, Aylin Yener, Sennur Ulukus
This paper explores opportunities and challenges of task (goal)-oriented and semantic communications for next-generation (NextG) communication networks through the integration of multi-task learning.
no code implementations • 21 Dec 2023 • Yalin E. Sagduyu, Tugba Erpek, Aylin Yener, Sennur Ulukus
Recognizing the computational constraints and trust issues associated with on-device computation, we propose a collaborative system wherein the edge device communicates selectively processed information to a trusted receiver acting as a fusion center, where a decision is made on whether a potential transmitter is present or not.
no code implementations • 21 Nov 2023 • Matin Mortaheb, Mohammad A. Amir Khojastepour, Srimat T. Chakradhar, Sennur Ulukus
The encoded bitrate and the quality of the compressed video depend on encoder parameters, specifically, the quantization parameter (QP).
no code implementations • 8 Nov 2023 • Yalin E. Sagduyu, Tugba Erpek, Aylin Yener, Sennur Ulukus
The transmitter employs a deep neural network, namely an encoder, for joint operations of source coding, channel coding, and modulation, while the receiver utilizes another deep neural network, namely a decoder, for joint operations of demodulation, channel decoding, and source decoding to reconstruct the data samples.
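As a toy illustration of this encoder-decoder split, the end-to-end pipeline can be sketched with simple linear maps standing in for the two DNNs; the dimensions, SNR, and least-squares decoder are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear stand-ins for the two DNNs: the "encoder" maps a source
# sample straight to channel symbols (source coding, channel coding,
# and modulation collapsed into one map), and the "decoder" maps the
# noisy channel output back to a reconstruction.
dim_src, dim_ch = 8, 16                  # illustrative dimensions
W_enc = rng.standard_normal((dim_ch, dim_src)) / np.sqrt(dim_src)
W_dec = np.linalg.pinv(W_enc)            # least-squares "decoder"

def channel(s, snr_db=20.0):
    """Additive white Gaussian noise channel."""
    noise_std = 10 ** (-snr_db / 20)
    return s + noise_std * rng.standard_normal(s.shape)

x = rng.standard_normal(dim_src)         # source sample
y = channel(W_enc @ x)                   # transmit over the noisy channel
x_hat = W_dec @ y                        # reconstruct at the receiver
mse = float(np.mean((x - x_hat) ** 2))
```

In the actual system both maps would be trained jointly end to end, so that the learned code adapts to both the source statistics and the channel noise at once.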
no code implementations • 2 Oct 2023 • Purbesh Mitra, Sennur Ulukus
We consider a gossip network of $n$ nodes that tracks the information at a source.
no code implementations • 28 Sep 2023 • Cemil Vahapoglu, Timothy J. O'Shea, Tamoghna Roy, Sennur Ulukus
The advancement of fifth generation (5G) wireless communication networks has created a greater demand for wireless resource management solutions that offer high data rates, extensive coverage, minimal latency, and energy-efficient performance.
no code implementations • 27 Sep 2023 • Matin Mortaheb, Mohammad A. Amir Khojastepour, Srimat T. Chakradhar, Sennur Ulukus
The objective is to maintain an encoded video bitrate slightly below the available channel bitrate.
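A minimal sketch of such a rate-control loop, assuming the standard rule of thumb that the encoded bitrate roughly halves for every 6-step increase in QP; the `encoded_bitrate` model, the margin, and the step-by-one controller are illustrative assumptions, not the paper's method:

```python
def encoded_bitrate(qp, base_rate=8000.0):
    # Hypothetical encoder model (kbps as a function of QP): rate
    # roughly halves every 6 QP steps, anchored at QP 22.
    return base_rate * 2 ** (-(qp - 22) / 6)

def adapt_qp(qp, channel_kbps, margin=0.95):
    """Nudge QP so the encoded bitrate stays just below the channel bitrate."""
    if encoded_bitrate(qp) > margin * channel_kbps:
        return min(qp + 1, 51)       # over budget: coarser quantization
    if encoded_bitrate(qp - 1) < margin * channel_kbps:
        return max(qp - 1, 0)        # headroom: finer quantization
    return qp                        # already just below the budget

qp = 30
for _ in range(50):                  # iterate until QP settles
    qp = adapt_qp(qp, channel_kbps=3000.0)
```

With these numbers the loop settles at QP 31, the smallest QP whose modeled bitrate fits under 95% of the 3000 kbps channel budget.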
no code implementations • 22 Aug 2023 • Matin Mortaheb, Mohammad A. Amir Khojastepour, Srimat T. Chakradhar, Sennur Ulukus
Experiments on both datasets illustrate that our proposed method surpasses the SSCC method in reconstructing data at different resolutions, enabling the extraction of semantic features with heightened confidence in successive layers.
no code implementations • 14 Aug 2023 • Yalin E. Sagduyu, Tugba Erpek, Aylin Yener, Sennur Ulukus
A multi-task deep learning approach that involves training a common encoder at the transmitter and individual decoders at the receivers is presented for joint optimization of completing multiple tasks and communicating with multiple receivers.
no code implementations • 21 Jun 2023 • Purbesh Mitra, Sennur Ulukus
The goal of each client is to converge to the global model, while maintaining timeliness of the clients, i.e., having optimum training iteration time.
no code implementations • 11 Jan 2023 • Yalin E. Sagduyu, Sennur Ulukus, Aylin Yener
This paper studies the notion of age in task-oriented communications that aims to execute a task at a receiver utilizing the data at its transmitter.
no code implementations • 21 Dec 2022 • Matin Mortaheb, Sennur Ulukus
Our algorithm uses exchanged gradients to calculate the correlations among tasks automatically, and dynamically adjusts the communication graph to connect mutually beneficial tasks and isolate those that may negatively impact each other.
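A minimal sketch of this idea, using pairwise cosine similarity of flattened task gradients to decide which tasks get connected; the zero threshold and the toy gradients are illustrative assumptions:

```python
import numpy as np

def task_graph(grads, threshold=0.0):
    """Build a communication graph from pairwise gradient cosine
    similarities: tasks whose gradients align (similarity above the
    threshold) are connected; conflicting tasks stay isolated.
    `grads` is a (num_tasks, dim) array of flattened gradients."""
    g = grads / np.linalg.norm(grads, axis=1, keepdims=True)
    sim = g @ g.T                    # pairwise cosine similarities
    adj = sim > threshold            # symmetric adjacency matrix
    np.fill_diagonal(adj, False)     # no self-edges
    return adj

grads = np.array([[1.0, 0.0],        # task 0
                  [0.9, 0.1],        # task 1: aligned with task 0
                  [-1.0, 0.0]])      # task 2: conflicts with both
adj = task_graph(grads)
```

Here tasks 0 and 1 end up connected (their gradients point the same way), while task 2, whose gradient opposes the others, is isolated from both.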
no code implementations • 21 Dec 2022 • Yalin E. Sagduyu, Tugba Erpek, Sennur Ulukus, Aylin Yener
The backdoor attack can effectively change the semantic information transferred for the poisoned input samples to a target meaning.
no code implementations • 20 Dec 2022 • Yalin E. Sagduyu, Tugba Erpek, Sennur Ulukus, Aylin Yener
By augmenting the reconstruction loss with a semantic loss, the two deep neural networks (DNNs) of this encoder-decoder pair are interactively trained with the DNN of the semantic task classifier.
no code implementations • 19 Dec 2022 • Yalin E. Sagduyu, Sennur Ulukus, Aylin Yener
In this paper, wireless signal classification is considered as the task for the NextG Radio Access Network (RAN), where edge devices collect wireless signals for spectrum awareness and communicate with the NextG base station (gNodeB) that needs to identify the signal label.
no code implementations • 14 Dec 2022 • Cemil Vahapoglu, Matin Mortaheb, Sennur Ulukus
MTL can be integrated into a federated learning (FL) setting if tasks are distributed across clients and clients have a single shared network, leading to personalized federated learning (PFL).
no code implementations • 31 May 2022 • Sajani Vithana, Sennur Ulukus
We investigate the problem of private read update write (PRUW) in federated submodel learning (FSL) with sparsification.
no code implementations • 18 May 2022 • Batuhan Arasli, Sennur Ulukus
We characterize the performance of the dynamic individual testing algorithm and introduce a novel dynamic SAFFRON-based group testing algorithm.
no code implementations • 24 Mar 2022 • Matin Mortaheb, Cemil Vahapoglu, Sennur Ulukus
In federated settings, statistical heterogeneity due to different task complexities and data heterogeneity due to the non-i.i.d. nature of local datasets can both degrade the learning performance of the system.
no code implementations • 21 Dec 2021 • Brian Kim, Tugba Erpek, Yalin E. Sagduyu, Sennur Ulukus
Results from different network topologies show that adversarial perturbation and RIS interaction vector can be jointly designed to effectively increase the signal detection accuracy at the receiver while reducing the detection accuracy at the eavesdropper to enable covert communications.
no code implementations • 16 Sep 2021 • Brian Kim, Yi Shi, Yalin E. Sagduyu, Tugba Erpek, Sennur Ulukus
The DNN that corresponds to a regression model is trained with channel gains as the input and returns transmit powers as the output.
no code implementations • 30 Jul 2021 • Sennur Ulukus, Salman Avestimehr, Michael Gastpar, Syed Jafar, Ravi Tandon, Chao Tian
Most of our lives are conducted in cyberspace.
no code implementations • 25 Mar 2021 • Brian Kim, Yalin E. Sagduyu, Tugba Erpek, Sennur Ulukus
Deep learning provides powerful means to learn from spectrum data and solve complex tasks in 5G and beyond such as beam selection for initial access (IA) in mmWave communications.
no code implementations • 1 Mar 2021 • Baturalp Buyukates, Emre Ozfatura, Sennur Ulukus, Deniz Gunduz
Distributed implementations are crucial in speeding up large scale machine learning applications.
no code implementations • 14 Jan 2021 • Batuhan Arasli, Sennur Ulukus
We propose a class of two-step sampled group testing algorithms where we exploit the known probabilistic infection spread model.
no code implementations • 31 Dec 2020 • Baturalp Buyukates, Sennur Ulukus
Under the proposed scheme, at each iteration, the PS waits for $m$ available clients and sends them the current model.
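A toy sketch of one such round, with a scalar model, hypothetical per-client targets, and random sampling standing in for "the first $m$ clients to become available":

```python
import random

# Illustrative federated setup: 10 clients, each holding a scalar
# target as a stand-in for its local data.
clients = list(range(10))
targets = {c: float(c) for c in clients}

def local_update(model, c, lr=0.5):
    # One local gradient step toward the client's own optimum.
    return model + lr * (targets[c] - model)

def fedavg_round(model, m, rng):
    """The PS waits for m available clients, sends them the current
    model, and averages their returned local updates."""
    available = rng.sample(clients, m)   # stand-in for the first m to respond
    return sum(local_update(model, c) for c in available) / m

rng = random.Random(0)
model = 0.0
for _ in range(100):
    model = fedavg_round(model, m=4, rng=rng)
```

Because only a random subset of $m$ clients contributes per round, the model fluctuates around the average of the client optima rather than converging exactly, which is the timeliness-versus-accuracy tension such schemes trade off.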
no code implementations • 24 Dec 2020 • Melih Bastopcu, Sennur Ulukus
We observe that if the total test rate is limited, instead of testing all members of the population equally, only a portion of the population is tested based on their infection and recovery rates.
no code implementations • 3 Dec 2020 • Brian Kim, Yalin E. Sagduyu, Tugba Erpek, Kemal Davaslioglu, Sennur Ulukus
The transmitter is equipped with a deep neural network (DNN) classifier for detecting the ongoing transmissions from the background emitter and transmits a signal if the spectrum is idle.
no code implementations • 3 Nov 2020 • Baturalp Buyukates, Emre Ozfatura, Sennur Ulukus, Deniz Gunduz
In distributed synchronous gradient descent (GD), the slowest \textit{straggling} workers are the main performance bottleneck for the per-iteration completion time.
no code implementations • 31 Jul 2020 • Brian Kim, Yalin E. Sagduyu, Tugba Erpek, Kemal Davaslioglu, Sennur Ulukus
First, we show that multiple independent adversaries, each with a single antenna, cannot improve the attack performance compared to a single adversary with multiple antennas using the same total power.
no code implementations • 4 Jul 2020 • Emre Ozfatura, Sennur Ulukus, Deniz Gunduz
In this paper, we first introduce a novel coded matrix-vector multiplication scheme, called coded computation with partial recovery (CCPR), which combines the advantages of coded and uncoded computation and reduces both the computation time and the decoding complexity by allowing a trade-off between the accuracy and the speed of computation.
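A minimal sketch of plain MDS-coded matrix-vector multiplication, the baseline that partial-recovery schemes build on; the Vandermonde code, the block sizes, and which workers respond are illustrative choices, and partial recovery itself is not implemented here:

```python
import numpy as np

def mds_encode(A, n, k, evals):
    """Split A row-wise into k blocks and form n coded blocks via a
    Vandermonde code, so the results from ANY k of the n workers
    suffice to recover A @ x (tolerating n - k stragglers)."""
    blocks = np.split(A, k)                      # rows of A divisible by k
    G = np.vander(evals, k, increasing=True)     # n x k generator matrix
    return [sum(G[j, i] * blocks[i] for i in range(k)) for j in range(n)]

def mds_decode(results, worker_ids, evals, k):
    """Recover A @ x from the results of any k workers."""
    G = np.vander(evals, k, increasing=True)
    Gs = G[worker_ids, :]                        # k x k Vandermonde submatrix
    Y = np.stack([results[w] for w in worker_ids])
    X = np.linalg.solve(Gs, Y)                   # row i of X equals A_i @ x
    return X.reshape(-1)

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))                  # 6 rows, split into k=3 blocks
x = rng.standard_normal(4)
n, k = 5, 3
evals = np.arange(1.0, n + 1)                    # distinct evaluation points
coded = mds_encode(A, n, k, evals)
partial = {j: coded[j] @ x for j in [0, 2, 4]}   # only 3 of 5 workers return
y = mds_decode(partial, [0, 2, 4], evals, k)
```

Decoding requires inverting a k x k Vandermonde submatrix, which is exactly the decoding-complexity cost that partial-recovery designs aim to reduce by also accepting incomplete results.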
no code implementations • 2 Jun 2020 • Emre Ozfatura, Baturalp Buyukates, Deniz Gunduz, Sennur Ulukus
To mitigate biased estimators, we design a \textit{timely} dynamic encoding framework for partial recovery that includes an ordering operator that changes the codewords and computation orders at workers over time.
no code implementations • 15 May 2020 • Brian Kim, Yalin E. Sagduyu, Kemal Davaslioglu, Tugba Erpek, Sennur Ulukus
We consider the problem of hiding wireless communications from an eavesdropper that employs a deep learning (DL) classifier to detect whether any transmission of interest is present or not.
no code implementations • 11 May 2020 • Brian Kim, Yalin E. Sagduyu, Kemal Davaslioglu, Tugba Erpek, Sennur Ulukus
There is a transmitter that transmits signals with different modulation types.
no code implementations • 10 Apr 2020 • Emre Ozfatura, Sennur Ulukus, Deniz Gunduz
When gradient descent (GD) is scaled to many parallel workers for large scale machine learning problems, its per-iteration computation time is limited by the straggling workers.
no code implementations • 5 Feb 2020 • Brian Kim, Yalin E. Sagduyu, Kemal Davaslioglu, Tugba Erpek, Sennur Ulukus
In the meantime, the adversary makes over-the-air transmissions that are received as superimposed with the transmitter's signals to fool the classifier at the receiver into making errors.
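As a toy illustration of such a superimposed perturbation, consider a receiver whose classifier is a known linear decision rule; the adversary then only needs a small FGSM-style push along the classifier's weight direction. The linear classifier, the weight vector, and the budget scaling are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for the receiver's classifier: a known linear decision
# rule sign(w . r) on the received signal r.
dim = 16
w = rng.standard_normal(dim)

def classify(r):
    return 1 if w @ r > 0 else -1

def adversarial_perturbation(signal, budget):
    """Perturbation the adversary transmits over the air; it arrives
    superimposed on the legitimate signal and pushes the received sum
    toward the classifier's decision boundary."""
    label = classify(signal)
    # steepest direction against the (known) linear decision rule
    return -label * budget * w / np.linalg.norm(w)

s = rng.standard_normal(dim)                     # legitimate signal
# scale the budget just past the signal's margin so the label flips
margin = abs(w @ s) / np.linalg.norm(w)
delta = adversarial_perturbation(s, budget=1.1 * margin)
flipped = classify(s + delta) != classify(s)
```

Against a DNN classifier the gradient of the model's loss plays the role that the known weight vector `w` plays here, but the mechanism — a small additive signal that crosses the decision boundary — is the same.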
no code implementations • 5 Mar 2019 • Emre Ozfatura, Deniz Gunduz, Sennur Ulukus
Gradient descent (GD) methods are commonly employed in machine learning problems to optimize the parameters of the model in an iterative fashion.
no code implementations • 22 Nov 2018 • Emre Ozfatura, Sennur Ulukus, Deniz Gunduz
Coded computation techniques provide robustness against straggling servers in distributed computing, with the following limitations: First, they increase decoding complexity.
no code implementations • 7 Aug 2018 • Emre Ozfatura, Deniz Gunduz, Sennur Ulukus
In most of the existing DGD schemes, either with coded computation or coded communication, the non-straggling CSs transmit one message per iteration once they complete all their assigned computation tasks.