
I'm encountering a runtime error when I change the number of nodes in a NetworkX-generated graph that I pass through a Graph Neural Network (GNN). Here's my GNN code, which seems independent of the graph's node count:

import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv

class GCN(nn.Module):
    def __init__(self, input_size, hidden_size, num_classes):
        super(GCN, self).__init__()
        self.layer1 = GCNConv(input_size, hidden_size)
        self.layer2 = GCNConv(hidden_size, hidden_size)
        self.layer3 = GCNConv(hidden_size, num_classes)
        self.softmax = nn.Softmax(dim=0)

    def forward(self, node_features, edge_index):
        output = self.layer1(node_features, edge_index)
        output = torch.relu(output)
        output = self.layer2(output, edge_index)
        output = torch.relu(output)
        output = self.layer3(output, edge_index)
        output = self.softmax(output)

        return output

This is how I am creating the graph and removing a node from it.

import random

import networkx as nx
import torch

def generate_graph(num_nodes):
    # generate a random connected graph
    Graph = nx.gnm_random_graph(num_nodes, random.randint(num_nodes, num_nodes*2), seed=42)
    while not nx.is_connected(Graph):
        # retry without the fixed seed, otherwise a disconnected draw can repeat forever
        Graph = nx.gnm_random_graph(num_nodes, random.randint(num_nodes, num_nodes*2))

    # add features to nodes
    # node 0 will be the source node
    # each node will have a feature of 3
    # first feature will represent the node's bias (a random value between 0 and 1)
    # second feature will represent if the node is a source node (0 or 1, 1 if the node is the source node)
    # third feature will represent the node's degree
    for node in Graph.nodes:
        Graph.nodes[node]['feature'] = [random.random(), 1 if node == 0 else 0, Graph.degree[node]]

    node_features = Graph.nodes.data('feature')
    node_features = torch.tensor([node_feature[1] for node_feature in node_features])
    edge_index = torch.tensor(list(Graph.edges)).t().contiguous()

    return Graph, node_features, edge_index
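To clarify the `node_feature[1]` indexing above: `Graph.nodes.data('feature')` yields `(node, feature)` pairs, so the comprehension keeps only the feature vectors. A quick NetworkX-only sketch on a toy 4-node path graph (not my actual 10-node graph):

```python
import networkx as nx

# small illustrative graph, with the same three-element feature layout
G = nx.path_graph(4)
for node in G.nodes:
    G.nodes[node]['feature'] = [0.5, 1 if node == 0 else 0, G.degree[node]]

# .data('feature') yields (node, feature) pairs, hence the [1] indexing
pairs = list(G.nodes.data('feature'))
print(pairs[0])                    # (0, [0.5, 1, 1])

features = [p[1] for p in pairs]   # feature vectors only, in node order
print(features)
```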


def remove_node_from_graph(Graph, node):
    # remove the node from the graph
    Graph.remove_node(node)

    # update the degree feature of the remaining nodes
    # (renamed the loop variable so it no longer shadows the `node` parameter)
    for n in Graph.nodes:
        Graph.nodes[n]['feature'][2] = Graph.degree[n]

    node_features = Graph.nodes.data('feature')
    node_features = torch.tensor([node_feature[1] for node_feature in node_features])
    edge_index = torch.tensor(list(Graph.edges)).t().contiguous()

    return Graph, node_features, edge_index

Training my GCN with a 10-node graph succeeds, but when I remove one node and pass the modified graph through the GCN, I encounter the error:

RuntimeError: index 9 is out of bounds for dimension 0 with size 9

Surprisingly, the process works fine when I generate a new 9-node graph after the initial training step. I'm struggling to pinpoint where I might be making a mistake. Any insights would be greatly appreciated!
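Here is a diagnostic I ran on a toy graph (removing node 5 as an arbitrary example): NetworkX keeps the original node labels after `remove_node`, so the labels are no longer a contiguous `0..n-1` range:

```python
import networkx as nx

# nodes of a gnm graph are labeled 0..9 regardless of the edges drawn
G = nx.gnm_random_graph(10, 15, seed=42)
G.remove_node(5)

print(G.number_of_nodes())   # 9
print(sorted(G.nodes))       # [0, 1, 2, 3, 4, 6, 7, 8, 9] -- label 5 is just gone
print(max(G.nodes))          # 9 -- but a feature tensor built from 9 nodes has rows 0..8
```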

  • Always put the FULL error message (starting at the word "Traceback") in the question (not in comments) as text (not a screenshot or a link to an external portal). There is other useful information in the full error/traceback. Commented Dec 17, 2023 at 1:50
  • If you have 9 nodes then they are numbered 0...8, but it seems somewhere you use 9. Commented Dec 17, 2023 at 1:52
  • Maybe first use print() (and print(type(...)), print(len(...)), etc.) to see which part of the code is executed and what you really have in variables. It is called "print debugging" and it helps to see what the code is really doing. Commented Dec 17, 2023 at 1:53
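Following the second comment's diagnosis: after removal, edge_index can still contain the label 9 while the rebuilt feature tensor only has 9 rows (indices 0..8). One possible fix, sketched below and not tested against the original training setup, is to relabel the nodes to a contiguous range after removal, e.g. with nx.convert_node_labels_to_integers (node attributes are carried over by the relabeling):

```python
import networkx as nx

G = nx.gnm_random_graph(10, 15, seed=42)
G.remove_node(5)

# relabel so node ids are contiguous 0..n-1 again, matching the row
# indices of a freshly built feature tensor
G = nx.convert_node_labels_to_integers(G, first_label=0, ordering="sorted")
print(sorted(G.nodes))   # [0, 1, 2, 3, 4, 5, 6, 7, 8]
```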
