# LWM: Large Wireless Model
LWM is a powerful pre-trained model developed as a universal feature extractor for wireless channels. As the world's first foundation model in this domain, it uses transformer architectures to extract refined representations from simulated datasets (such as DeepMIMO and Sionna) and real-world wireless data.
Click here to try the Interactive Demo!
[Click here to try the Colab Notebook!](https://colab.research.google.com/drive/1a_eNi-HG79CY-iwnnlyR41uL8PrG7EIj?usp=sharing)
## ✨ Features
### How is LWM built?
The LWM model is based on transformers, which capture both fine-grained and global dependencies within channel data. Unlike traditional models limited to specific tasks, LWM is trained with a self-supervised approach, the proposed Masked Channel Modeling (MCM). This method trains the model on unlabeled data by predicting masked channel segments, enabling it to learn intricate relationships between antennas and subcarriers. With bidirectional attention, LWM interprets the full context by attending to both preceding and succeeding channel segments. The resulting embeddings encode comprehensive spatial information and are applicable to various scenarios.
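To illustrate the MCM idea (this is a toy sketch, not the actual pre-training code), the snippet below splits a channel into fixed-size segments and masks a random subset of them; the model would then be trained to reconstruct the masked segments from context. The channel dimensions, patch size, and masking ratio here are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy complex channel: 32 BS antennas x 32 subcarriers (sizes are illustrative).
channel = rng.standard_normal((32, 32)) + 1j * rng.standard_normal((32, 32))

# Split the channel into fixed-size segments, as MCM operates on channel segments.
patch_size = 16
patches = channel.reshape(-1, patch_size)  # shape (64, 16)

# Mask a random 15% of the segments; during pre-training, the transformer
# would be asked to predict these from the unmasked context.
mask_ratio = 0.15
n_masked = int(mask_ratio * len(patches))
masked_idx = rng.choice(len(patches), size=n_masked, replace=False)

masked_patches = patches.copy()
masked_patches[masked_idx] = 0  # zero out the masked segments

print(f"{n_masked} of {len(patches)} patches masked")  # 9 of 64 patches masked
```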
### What does LWM offer?
LWM provides a universal feature extraction framework applicable to diverse wireless communication and sensing tasks. It can handle complex wireless environments, capturing channel characteristics for robust performance across different scenarios and conditions. Trained on hundreds of thousands of wireless channel samples, LWM generalizes across varied environments, from dense urban areas to synthetic setups, ensuring adaptability and consistency across a broad spectrum of wireless tasks.
### How is LWM used?
LWM is designed to be easily integrated into downstream applications as a source of high-quality embeddings that encapsulate complex channel features. By feeding raw wireless channel data into the pre-trained model, users get embeddings that capture essential spatial relationships and interactions within the channel environment. These embeddings offer a versatile and contextualized representation of wireless data that can be used in different applications. Using the pre-trained model in this way reduces the need for extensive labeled data while retaining the critical properties of the original channel.
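The workflow above (raw channels in, embeddings out) can be sketched as follows. Note that the random projection here is only a stand-in for the pre-trained transformer encoder, and the embedding dimension is an assumption; in practice you would load the actual model from the repository:

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_embeddings(channels, embed_dim=64):
    """Stand-in for the pre-trained LWM encoder (hypothetical dimensions)."""
    n = channels.shape[0]
    # Flatten each complex channel into a real-valued feature vector ...
    features = np.concatenate(
        [channels.real.reshape(n, -1), channels.imag.reshape(n, -1)], axis=1
    )
    # ... then project into the embedding space (stand-in for the transformer).
    projection = rng.standard_normal((features.shape[1], embed_dim))
    return features @ projection

# Toy batch of 10 channels, each 32 BS antennas x 32 subcarriers.
channels = rng.standard_normal((10, 32, 32)) + 1j * rng.standard_normal((10, 32, 32))
embeddings = extract_embeddings(channels)
print(embeddings.shape)  # (10, 64)
```

The embeddings can then be fed to any downstream classifier or regressor in place of the raw channels.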
### Advantages of Using LWM
- **Various Tasks**: Self-supervised and pre-trained without labels, LWM excels in a wide range of wireless tasks, offering flexibility and performance.
- **Limited Data**: With LWM embeddings, downstream tasks achieve high accuracy with less data, reducing reliance on large labeled datasets.
- **Various Environments**: Pre-trained on diverse data, LWM performs reliably in environments ranging from urban to rural areas.
Join the growing community of researchers using LWM for their wireless communication and sensing research, and unlock a new level of performance and insight in your models!
## 📦 Installation
### 1. Install Conda
First, make sure you have a package manager like Conda installed to manage your Python environments and packages. You can install Conda via Anaconda or Miniconda.
- **Anaconda**: a comprehensive distribution that includes a full scientific package suite. Download it here.
- **Miniconda**: a lightweight version that includes only Conda and Python. Download it here.
Once installed, you can use Conda to manage environments.
### 2. Create a New Environment
After installing Conda, follow these steps to create a new environment and install the required packages.
**Step 1: Create a new environment**

Open the Anaconda PowerShell Prompt and create a new Conda environment named `lwm_env`:

```bash
conda create -n lwm_env
```
**Step 2: Activate the environment**

```bash
conda activate lwm_env
```
### 3. Install Required Packages
Once the environment is activated, install the necessary packages.
**Install CUDA-enabled PyTorch**

Although inference can run efficiently on a CPU, you may need a GPU for training more resource-intensive downstream tasks. Visit [this page](https://pytorch.org/get-started/locally/) and select the appropriate options based on your system's specifications; the website will generate a tailored installation command.
For example, on an NVIDIA system, you can use a command like the following, with the appropriate CUDA version for your system:

```bash
conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
```
This command installs PyTorch with CUDA support for GPU-accelerated training. Ensure that the specified CUDA version is compatible with your system, and adjust it if necessary.
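After installation, you can confirm that PyTorch was installed correctly and can see your GPU. The check below degrades gracefully if PyTorch is not importable:

```python
import importlib.util

# Verify the PyTorch installation and report CUDA availability.
if importlib.util.find_spec("torch") is None:
    print("PyTorch is not installed in this environment.")
else:
    import torch
    print(f"PyTorch {torch.__version__}, CUDA available: {torch.cuda.is_available()}")
```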
**⚠️ Important Note**
If you encounter issues installing CUDA-enabled PyTorch, verify your CUDA version compatibility. The problem may also stem from conflicting installation attempts; try a fresh environment.
**Install Other Required Packages via Conda Forge**

```bash
conda install python numpy pandas matplotlib tqdm -c conda-forge
```
**Install DeepMIMOv3 with pip**

```bash
pip install DeepMIMOv3
```
### 4. Clone the Dataset Scenarios
The following function clones specific dataset scenarios from a repository:
```python
import subprocess
import os
import shutil

def clone_dataset_scenario(repo_url, model_repo_dir="./LWM", scenarios_dir="scenarios"):
    """
    Clones all scenarios from a repository, ensuring all files (small and large) are downloaded.

    Args:
        repo_url (str): URL of the Git repository
        model_repo_dir (str): Path to the model repository
        scenarios_dir (str): Directory name for storing scenarios
    """
    # If we are already inside the model repository, clone into the current directory
    current_dir = os.path.basename(os.getcwd())
    if current_dir == "LWM":
        model_repo_dir = "."

    scenarios_path = os.path.join(model_repo_dir, scenarios_dir)
    original_dir = os.getcwd()

    try:
        # Remove any previous (possibly partial) copy before cloning
        if os.path.exists(scenarios_path):
            shutil.rmtree(scenarios_path)

        print("Cloning entire repository into temporary directory...")
        subprocess.run(["git", "clone", repo_url, scenarios_path], check=True)

        # Pull the large files tracked with Git LFS
        os.chdir(scenarios_path)
        print("Pulling all files using Git LFS...")
        subprocess.run(["git", "lfs", "install"], check=True)
        subprocess.run(["git", "lfs", "pull"], check=True)
        print(f"Successfully cloned all scenarios into {scenarios_path}")
    except subprocess.CalledProcessError as e:
        print(f"Error cloning scenarios: {e}")
    finally:
        # Always return to the original working directory
        os.chdir(original_dir)
```
### 5. Clone the Model Repository
Now, clone the LWM model repository to your local system.
```python
model_repo_url = "https://huggingface.co/wi-lab/lwm"
model_repo_dir = "./LWM"

if not os.path.exists(model_repo_dir):
    print(f"Cloning model repository from {model_repo_url}...")
    subprocess.run(["git", "clone", model_repo_url, model_repo_dir], check=True)
```
### 6. Clone the Desired Dataset Scenarios
You can now clone specific scenarios from the DeepMIMO dataset, as detailed in the table below:
#### Dataset Overview

| Dataset | City | Number of Users | DeepMIMO Page |
|---------|------|-----------------|---------------|
| Dataset 0 | Denver | 1354 | [DeepMIMO City Scenario 18](https://www.deepmimo.net/scenarios/deepmimo-city-scenario18/) |
| Dataset 1 | Indianapolis | 3248 | [DeepMIMO City Scenario 15](https://www.deepmimo.net/scenarios/deepmimo-city-scenario15/) |
| Dataset 2 | Oklahoma | 3455 | [DeepMIMO City Scenario 19](https://www.deepmimo.net/scenarios/deepmimo-city-scenario19/) |
| Dataset 3 | Fort Worth | 1902 | [DeepMIMO City Scenario 12](https://www.deepmimo.net/scenarios/deepmimo-city-scenario12/) |
| Dataset 4 | Santa Clara | 2689 | [DeepMIMO City Scenario 11](https://www.deepmimo.net/scenarios/deepmimo-city-scenario11/) |
| Dataset 5 | San Diego | 2192 | [DeepMIMO City Scenario 7](https://www.deepmimo.net/scenarios/deepmimo-city-scenario7/) |
It is important to note that these six datasets were not used during the pre-training of the LWM model; the high-quality embeddings they produce are a testament to LWM's robust generalization capabilities rather than overfitting.

The operational settings below were used in generating the datasets for both the pre-training of LWM and the downstream tasks. If you intend to use custom datasets, please ensure they adhere to these configurations:
Operational Settings:
- Antennas at BS: 32
- Antennas at UEs: 1
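If you generate custom datasets, a quick sanity check like the one below can catch mismatched dimensions early. Only the antenna settings are stated above, so the number of users and subcarriers in this sketch are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical custom dataset shaped (users, BS antennas, subcarriers).
# 32 BS antennas and a single UE antenna match the settings above; the
# user and subcarrier counts here are illustrative assumptions.
n_users, n_bs_ant, n_subcarriers = 100, 32, 32
channels = rng.standard_normal((n_users, n_bs_ant, n_subcarriers)) \
           + 1j * rng.standard_normal((n_users, n_bs_ant, n_subcarriers))

assert channels.ndim == 3, "expected (users, BS antennas, subcarriers)"
assert channels.shape[1] == 32, "expected 32 antennas at the BS"
print(channels.shape)  # (100, 32, 32)
```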
## License
Please cite the following paper if you use the LWM model or any modified parts:
```bibtex
@misc{alikhani2024largewirelessmodellwm,
      title={Large Wireless Model (LWM): A Foundation Model for Wireless Channels},
      author={Sadjad Alikhani and Gouranga Charan and Ahmed Alkhateeb},
      year={2024},
      eprint={2411.08872},
      archivePrefix={arXiv},
      primaryClass={cs.IT},
      url={https://arxiv.org/abs/2411.08872},
}
```