Conversation

@JATAYU000 commented Nov 28, 2025

Reference Issues/PRs

Fixes #1899

What does this implement/fix? Explain your changes.

I have started interfacing iTransformer into PTFv2, porting it from the TSlib repository (thuml/iTransformer).
This is a work in progress; I would welcome suggestions on it.

What should a reviewer concentrate their feedback on?

  • Compliance of the current implementation with PTFv2 conventions

Did you add any tests for the change?

Not yet

Any other comments?

PR checklist

  • The PR title starts with either [ENH], [MNT], [DOC], or [BUG]. [BUG] - bugfix, [MNT] - CI, test framework, [ENH] - adding or improving code, [DOC] - writing or improving documentation or docstrings.
  • Added/modified tests
  • Used pre-commit hooks when committing to ensure that code is compliant with hooks. Install hooks with pre-commit install.
    To run hooks independent of commit, execute pre-commit run --all-files

@PranavBhatP (Contributor) left a comment

Thanks for the PR @JATAYU000. I've left some comments on it.

Contributor

Conventionally for v2, all the layers of a model's architecture live in the layers directory. Many of the layers you are using here can be imported directly from that directory; I see a lot of commonality. I would suggest adding only genuinely new layers (ones not already present in layers), each in a subdirectory layers/_<layer-type>. sub_modules.py is a v1 convention.
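As a rough illustration of that convention (the module and file names below are placeholders for the sketch, not the package's confirmed structure):

# hedged sketch only -- check the actual pytorch_forecasting.layers
# package before importing anything; these names are placeholders

# where an existing layer matches exactly, import and reuse it:
# from pytorch_forecasting.layers import SomeExistingAttentionLayer

# a genuinely new, iTransformer-specific layer would live in its own
# subdirectory instead of sub_modules.py, e.g.:
#   pytorch_forecasting/layers/_embeddings/_inverted_embedding.py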

Author

Some of the layers needed changes, so I have left them in sub_modules for now in the draft PR; I will fix this.

:, :, :N
] # filter covariates

if self.use_norm:
Contributor

self.use_norm is not required in the model code, since this will be handled by the D1/D2 layers. Normalization and denormalization need not be handled here; simply return dec_out.
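i.e. the end of forward would reduce to something like this (a minimal sketch; variable names follow the snippet above):

dec_out = dec_out[:, :, :N]  # filter covariates
# no self.use_norm branch here: normalization and denormalization
# are handled by the D1/D2 layers
return dec_out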

Author

Oh, thank you for pointing that out.

}

@classmethod
def get_test_train_params(cls):
Contributor

Can you add a few more test cases here?
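For example (the hyperparameter names below follow the TSlib iTransformer and are assumptions about this class's signature, with illustrative values):

@classmethod
def get_test_train_params(cls):
    # several small configurations so the test framework exercises
    # more than just the defaults; values are placeholders
    return [
        {},
        dict(d_model=16, n_heads=2),
        dict(d_model=32, e_layers=2, dropout=0.2),
    ]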

"""
An implementation of iTransformer model for v2 of pytorch-forecasting.

Parameters
Contributor

Docstring for model hyperparameters is missing.
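Something along these lines would work, in numpydoc style; the parameter names and defaults are assumptions carried over from the TSlib iTransformer, not the actual signature:

"""
An implementation of iTransformer model for v2 of pytorch-forecasting.

Parameters
----------
d_model : int, default=512
    Dimension of the inverted variate-token embeddings.
n_heads : int, default=8
    Number of attention heads per encoder layer.
e_layers : int, default=2
    Number of encoder layers.
d_ff : int, default=2048
    Hidden size of the position-wise feed-forward network.
dropout : float, default=0.1
    Dropout rate for the attention and feed-forward blocks.
"""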

@JATAYU000 (Author)

I would suggest only adding new layers (if not already present in layers) as a subdirectory - layers/_<layer-type>

@PranavBhatP The EncoderLayer in layers/_encoders/ requires cross_attention, but iTransformer only needs self_attention.
Should I make cross_attention optional in the existing layer, or create a separate EncoderLayer for iTransformer?

@JATAYU000 JATAYU000 requested a review from PranavBhatP December 1, 2025 14:50
@fkiraly (Collaborator) commented Dec 3, 2025

re layers, I would do as follows:

  • if the exact same layer is available in layers, reuse it
  • add new layers in layers
  • if a layer with modifications is needed, add it as a separate layer
  • optionally (in this PR, but it can also be a later PR) check whether multiple similar layers can be "unified" into a single layer with more parameters; see the sketch after this list
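For the cross_attention question above, the "unified layer" option might look roughly like this; the class name mirrors the existing EncoderLayer, but the signature and sub-module behavior are assumptions for the sketch, not the actual layers/_encoders/ API:

import torch.nn as nn


class EncoderLayer(nn.Module):
    """Encoder layer with optional cross-attention (illustrative sketch)."""

    def __init__(self, self_attention, cross_attention=None, d_model=512):
        super().__init__()
        self.self_attention = self_attention
        # cross_attention=None covers self-attention-only models
        # such as iTransformer
        self.cross_attention = cross_attention
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x, cross=None):
        # residual + norm details are simplified here
        x = self.norm1(x + self.self_attention(x))
        if self.cross_attention is not None:
            x = self.norm2(x + self.cross_attention(x, cross))
        return x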

@PranavBhatP (Contributor)

optionally - this PR but also can be later PR - check if multiple similar layers can be "unified" in a single layer with more parameters

@fkiraly seems like a nice "good first issue"?
