pytorch_geometric: Data Batch problem in PyG

🐛 Describe the bug

Hi. I am a computational physics researcher and have been using PyG successfully. My PyG code was working a few weeks ago, but now when I run it, it no longer works, even though I have not changed anything.

The problem is as follows. I have many material structures, and in my "custom_dataset" class these are preprocessed and all graph information (node features, edge features, edge index, etc.) is inserted into a PyTorch Geometric "Data" object. You can see below that each preprocessed sample with index i prints as a normal "Data" object in PyG:

[Screenshot 2: a preprocessed sample printed as a regular "Data" object]
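
For context, here is a minimal sketch of the kind of dataset described above. The class name "CustomDataset", the "structures" list, and the dictionary keys are illustrative assumptions, not the actual code:

from torch.utils.data import Dataset
from torch_geometric.data import Data

class CustomDataset(Dataset):
    """Wraps preprocessed material structures as PyG Data objects (sketch)."""
    def __init__(self, structures):
        self.structures = structures  # list of preprocessed structures (assumed)

    def __len__(self):
        return len(self.structures)

    def __getitem__(self, i):
        s = self.structures[i]
        # Node features, connectivity, and edge features go into one Data object.
        return Data(x=s['x'], edge_index=s['edge_index'], edge_attr=s['edge_attr'])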

But when I pass my custom dataset to the PyG DataLoader and do the following,

sample = next(iter(train_loader))  # batch sample

the batched sample is reported as a "DataDataBatch" object. I have never seen this object name before, and I cannot use "sample.x" or "sample.edge_index". Instead, I have to do the following:

[Screenshot 3: batch attributes only reachable through a workaround, not by direct attribute access]
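
For reference, this is how batching normally behaves in PyG (a hedged sketch; "dataset" stands for the custom dataset instance, and the batch size is an assumption):

from torch_geometric.loader import DataLoader

train_loader = DataLoader(dataset, batch_size=32, shuffle=True)
sample = next(iter(train_loader))  # normally a "DataBatch", not "DataDataBatch"

# Attribute access is expected to work directly on the batch:
print(sample.x)           # node features of all graphs, stacked
print(sample.edge_index)  # edge indices, concatenated with per-graph offsets
print(sample.edge_attr)   # edge features, concatenated
print(sample.batch)       # assigns each node to its graph within the batch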

I want to use expressions like "sample.x", "sample.edge_index", or "sample.edge_attr" as before. I would appreciate an explanation. Thank you.

Environment

  • PyG version: 2.0.5
  • PyTorch version: 1.11.0+cu113
  • OS: Google Colab Pro+
  • Python version: 3.7.13 (Colab)
  • CUDA/cuDNN version:
  • How you installed PyTorch and PyG (conda, pip, source):
# Install required packages.
import os
import torch
os.environ['TORCH'] = torch.__version__  # expose the torch version to the shell commands below
print(torch.__version__)
!pip install -q torch-scatter -f https://data.pyg.org/whl/torch-${TORCH}.html
!pip install -q torch-sparse -f https://data.pyg.org/whl/torch-${TORCH}.html
!pip install -q git+https://github.com/pyg-team/pytorch_geometric.git  # installs PyG from the master branch
!pip install -q pymatgen==2020.11.11
  • Any other relevant information (e.g., version of torch-scatter):

About this issue

  • Original URL
  • State: closed
  • Created 2 years ago
  • Comments: 31 (15 by maintainers)

Most upvoted comments

Hi @rusty1s, thanks for your response! Indeed, I'm a little busy these days, and I'm already using the filtering solution. I hope I'll find some time to debug it soon.