Output layer doesn't save tensor contents if manually define layers_to_save #46

@raktes

Description

If layers_to_save is defined manually for either log_forward_pass or save_new_activations, the output layer's tensor contents are not saved, unless the tensor contents of its parent layer are also saved.

Additionally, if the parent of the output layer is saved, the output's TensorLogEntry also has tensor_contents available. This part is understandable, since both entries would reference the same saved tensor.
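The shared-contents behavior can be illustrated with a minimal stand-in for the log entries (plain Python, not torchlens's actual classes): since forward returns x modified in place, the output and its parent refer to the same tensor, so saving one makes the contents visible through both.

```python
class FakeLogEntry:
    """Minimal stand-in for a torchlens TensorLogEntry (illustrative only)."""
    def __init__(self):
        self.tensor_contents = None

parent = FakeLogEntry()   # corresponds to iadd_2_2
output = FakeLogEntry()   # corresponds to output_1

saved_tensor = [1.0, 2.0, 3.0]  # stands in for the saved activation
parent.tensor_contents = saved_tensor
# The output entry aliases its parent's tensor instead of holding a copy,
# so saving the parent makes the output's contents available too.
output.tensor_contents = parent.tensor_contents

print(output.tensor_contents is parent.tensor_contents)  # True
```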

A small example:

from torch import nn
import torchlens as tl
import torch

class TestModel(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        x += 2
        x += 1
        return x

arg_test_model = TestModel()
x = torch.rand(6, 1, 3, 3)

mh = tl.log_forward_pass(arg_test_model, x, layers_to_save=["output_1", "iadd_1_1"])
print(mh["output_1"].tensor_contents is not None) # False (requested but missing)
print(mh["iadd_1_1"].tensor_contents is not None) # True
print(mh["iadd_2_2"].tensor_contents is not None) # False

mh = tl.log_forward_pass(arg_test_model, x, layers_to_save=["iadd_2_2"])
print(mh["output_1"].tensor_contents is not None) # True (not requested but available)
print(mh["iadd_1_1"].tensor_contents is not None) # False
print(mh["iadd_2_2"].tensor_contents is not None) # True

mh = tl.log_forward_pass(arg_test_model, x)
print(mh["output_1"].tensor_contents is not None) # True
print(mh["iadd_1_1"].tensor_contents is not None) # True
print(mh["iadd_2_2"].tensor_contents is not None) # True
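The pattern above looks consistent with a save filter that matches only the underlying operation label against layers_to_save, so the "output_1" alias never matches. A speculative sketch of such a filter (an assumption about the cause, not torchlens's actual implementation):

```python
def should_save(op_label, layers_to_save):
    # Hypothetical filter: compares only the raw op label against the
    # user's layers_to_save list, never the "output_1" alias.
    if layers_to_save == "all":
        return True
    return op_label in layers_to_save

# The output layer's underlying op is iadd_2_2, so requesting "output_1"
# alone never triggers a save for it:
print(should_save("iadd_2_2", ["output_1", "iadd_1_1"]))  # False
print(should_save("iadd_2_2", ["iadd_2_2"]))              # True
print(should_save("iadd_2_2", "all"))                     # True
```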
