Replies: 3 comments 6 replies
You may want to poke @jessegrabowski
Do you have any concrete examples in mind @kratsg? In my experience, usually anything beyond the most trivial examples is not symbolically integrable.
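To illustrate the point, here is a minimal sympy example (the integrands are my own, not from the discussion): a trivial density integrates in closed form, but a mildly less trivial one has no elementary antiderivative, and `sympy.integrate` silently returns an unevaluated `Integral` object rather than raising.

```python
import sympy as sp

x = sp.Symbol("x")

# Trivial case: closed form exists (equals 1 - exp(-1))
easy = sp.integrate(sp.exp(-x), (x, 0, 1))

# Slightly less trivial: no elementary antiderivative, so sympy
# returns an unevaluated Integral instead of a closed-form result
result = sp.integrate(sp.exp(sp.sin(x)), x)
print(isinstance(result, sp.Integral))
```

Any symbolic approach therefore needs to detect the unevaluated-`Integral` case and treat it as "symbolic integration failed".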
One other comment: the pytensor graph is already close to (is?) an AST representation of an expression, so you probably don't want to stringify it. Instead you can define how different pytensor ops convert to sympy, then just walk the tree applying the conversions. I'm interested in something like this (it's in scope for sympytensor), but I haven't thought carefully about how to handle dimensions. Sympy is fundamentally about scalar operations (plus some very limited support for linear algebra), while pytensor is fundamentally about tensors. So how should …
I wanted to understand what limitations there might be with the following approach, which combines scan with sympy's integral functionality.
I am thinking of a function that takes as input a pytensor `TensorVariable` that would: …

The goal here is to provide a way to arbitrarily normalize pytensor computational graphs based on the domains of the observables provided at evaluation time (not necessarily known ahead of time). What I'm not sure about, from the exploring I've done, is whether this is a solved problem, whether it's overkill, and whether I should integrate numerically (e.g. a Gauss-Legendre approach + scan) rather than symbolically, or instead fall back to numerical integration with Gauss-Legendre only when the symbolic approach fails.