
Start to handle branching in simple cases #16

Closed
narendasan opened this issue Mar 18, 2020 · 1 comment · Fixed by #81
Assignees: narendasan
Labels: component: conversion (Issues re: Conversion stage), component: lowering (Issues re: The lowering / preprocessing passes), feature request (New feature or request), priority: high
Milestone: v0.0.2

Comments

narendasan (Collaborator) commented on Mar 18, 2020:

The system works pretty well for traced models, but not much work has been done with TorchScript models that have branching. I noticed some common cases that we should be able to handle, such as branching on None arguments, which produces graphs like this (a small script that reproduces the pattern is sketched after the graph):

  %50 : Function = prim::Constant[name="linear"]()
  %53 : bool = prim::Constant[value=0]() # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1368:7
  %54 : None = prim::Constant() # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1368:40
  %55 : int = prim::Constant[value=2]() # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1368:22
  %56 : int = prim::Constant[value=1]() # :0:0
  %57 : int = aten::dim(%input1.1) # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1368:7
  %58 : bool = aten::eq(%57, %55) # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1368:7
  %59 : bool = prim::If(%58) # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1368:7
    block0():
      %60 : bool = aten::__isnot__(%94, %54) # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1368:28
      -> (%60)
    block1():
      -> (%53)
  %input2.1 : Tensor = prim::If(%59) # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1368:4
    block0():
      %bias0.4 : Tensor = prim::unchecked_cast(%94)
      %101 : Tensor = aten::linear(%input1.1, %95, %bias0.4)
      -> (%101)
    block1():
      %106 : Tensor? = prim::Constant()
      %107 : Tensor = aten::linear(%input1.1, %95, %106)
      %67 : bool = aten::__isnot__(%94, %54) # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1373:11
      %output0.6 : Tensor = prim::If(%67) # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1373:8
        block0():
          %bias1.4 : Tensor = prim::unchecked_cast(%94)
          %output0.7 : Tensor = aten::add_(%107, %bias1.4, %56) # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1374:12
          -> (%output0.7)
        block1():
          -> (%107)
      -> (%output0.6)
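
For reference, a minimal sketch (assuming a PyTorch build of that era, where nn.Linear still dispatches through the Python F.linear implementation rather than a single aten::linear call) that reproduces this kind of branching graph:

  import torch
  import torch.nn as nn

  class LinearModule(nn.Module):
      """Toy module: scripting it captures the Python-level None check on
      the Optional bias inside F.linear as prim::If blocks in the graph."""

      def __init__(self):
          super().__init__()
          self.linear = nn.Linear(10, 10)

      def forward(self, x):
          return self.linear(x)

  # Scripting (unlike tracing) preserves control flow, so the Optional bias
  # guard shows up as aten::__isnot__ / prim::If nodes like the ones above.
  scripted = torch.jit.script(LinearModule())
  print(scripted.graph)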
narendasan added the feature request (New feature or request), component: lowering (Issues re: The lowering / preprocessing passes), component: conversion (Issues re: Conversion stage), and priority: high labels on Mar 18, 2020
narendasan (Collaborator, Author) commented:

Relevant PR: https://github.com/pytorch/pytorch/pull/32178/files
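
One way these simple guards can be removed before conversion is to fold module attributes such as the bias into constants, which makes the None checks statically decidable. A minimal sketch, assuming a later PyTorch that provides torch.jit.freeze (just one possible approach, not necessarily what the linked PR implements):

  import torch
  import torch.nn as nn

  # Freezing requires eval mode; it inlines parameters (weight, bias) as constants.
  scripted = torch.jit.script(nn.Linear(10, 10)).eval()
  frozen = torch.jit.freeze(scripted)

  # With bias now a constant Tensor, checks like aten::__isnot__(%bias, None)
  # constant-fold, and the surrounding prim::If blocks can be eliminated.
  print(frozen.graph)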

narendasan added this to the v0.0.2 milestone on Apr 1, 2020
narendasan self-assigned this on Jun 2, 2020
frank-wei pushed a commit that referenced this issue on Jun 4, 2022:
Summary: Pull Request resolved: pytorch/fx2trt#16

Reviewed By: yinghai, wushirong

Differential Revision: D34772842

fbshipit-source-id: fd2577e54b1e5f563df2e1052773ef7b19069abe