Raise NotImplementedError for SplineWrapper gradient operation #2211
Changes from 5 commits
```diff
@@ -365,6 +365,7 @@ def conjugate_solve_triangular(outer, inner):
         grad = tt.triu(s + s.T) - tt.diag(tt.diagonal(s))
         return [tt.switch(ok, grad, floatX(np.nan))]
 
+
 class SplineWrapper (theano.Op):
```
**Review comment** (on `class SplineWrapper (theano.Op):`): pep8 does not recommend spacing before opening brackets.

**Author reply:** I've fixed all these issues. But just to argue about PEP8... :) PEP8 forbids spaces after functions or methods, but it doesn't say anything about a whitespace before the base class.

Also, it says that method definitions inside a class are surrounded by a single blank line, so a blank line before the first method in a class is legitimate, as it surrounds a class method; it is even possible to argue that PEP8 requires it there. Actually:

```
$ cat t.py
class Foo (object):
    def foo(self):
        print('hi')


class Bar(object):

    def bar(self):
        print('hi')
$ pep8 t.py
$ # no errors reported
```

And as PEP8 is a formalized description of the code style used in the Python standard library, we can check what is used in it, and it turns out that both variants are present there.

However, this was more just to argue; I've already committed the fixes, and I see that we need to not just follow PEP8 but also be consistent with other parts of PyMC3, so I agree with your nitpicks and find them absolutely reasonable :) As an off-topic remark, I actually think that it would be nice to have …
```diff
     """
     Creates a theano operation from scipy.interpolate.UnivariateSpline
```
```diff
@@ -377,22 +378,24 @@ class SplineWrapper (theano.Op):
     def __init__(self, spline):
         self.spline = spline
 
+    @property
```
**Review comment** (on `@property`): Nice!
```diff
+    def grad_op(self):
+        if not hasattr(self, '_grad_op'):
+            try:
+                self._grad_op = SplineWrapper(self.spline.derivative())
+            except ValueError:
+                self._grad_op = None
+
+        if self._grad_op is None:
+            raise NotImplementedError('Spline of order 0 is not differentiable')
+        return self._grad_op
+
     def perform(self, node, inputs, output_storage):
         x, = inputs
         output_storage[0][0] = np.asarray(self.spline(x))
 
-
-class DifferentiableSplineWrapper (SplineWrapper):
-    """
-    Creates a theano operation with defined gradient from
-    scipy.interpolate.UnivariateSpline
-    """
-
-    def __init__(self, spline):
-        super(DifferentiableSplineWrapper, self).__init__(spline)
-        self.spline_grad = SplineWrapper(spline.derivative())
-        self.__props__ += ('spline_grad',)
-
     def grad(self, inputs, grads):
         x, = inputs
         x_grad, = grads
-        return [x_grad * self.spline_grad(x)]
+        return [x_grad * self.grad_op(x)]
```
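The lazy, memoized `grad_op` pattern in the patch can be illustrated without theano or scipy. The sketch below is a minimal pure-Python analogue: the hypothetical `Poly` class stands in for `scipy.interpolate.UnivariateSpline` (its `derivative()` likewise raises `ValueError` when no derivative exists), and `PolyWrapper` mirrors the wrapper's cached `grad_op` property, including the conversion of that `ValueError` into `NotImplementedError`.

```python
class Poly:
    """Stand-in for a spline: a polynomial given by its coefficients."""

    def __init__(self, coeffs):
        self.coeffs = coeffs  # coeffs[i] multiplies x**i

    def __call__(self, x):
        return sum(c * x ** i for i, c in enumerate(self.coeffs))

    def derivative(self):
        # An order-0 polynomial has no representable derivative object,
        # mimicking the spline-of-order-0 case in the PR.
        if len(self.coeffs) <= 1:
            raise ValueError('cannot differentiate an order-0 polynomial')
        return Poly([i * c for i, c in enumerate(self.coeffs)][1:])


class PolyWrapper:
    def __init__(self, poly):
        self.poly = poly

    @property
    def grad_op(self):
        # Build the derivative wrapper lazily, exactly once; cache a failure
        # as None so later accesses re-raise NotImplementedError cheaply.
        if not hasattr(self, '_grad_op'):
            try:
                self._grad_op = PolyWrapper(self.poly.derivative())
            except ValueError:
                self._grad_op = None
        if self._grad_op is None:
            raise NotImplementedError('order-0 op is not differentiable')
        return self._grad_op
```

Repeated accesses to `grad_op` return the same cached wrapper, so op creation cost is paid once, which is the memoization the review thread below converges on.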
**Review comment:** You can create the new op right here; pure theano code is expected for the `grad` method.

**Author reply:** To avoid O(n) calculations on each call of …

**Review comment:** You can memoize the call. Memoize the op creation, to be more accurate.

**Author reply:** Yes, I thought about it. But I'm concerned that in this case gradient calculation time becomes non-deterministic. For example, it might significantly bias …

**Review comment:** Why? Functions are compiled after the graph is constructed. That will not affect runtime.

**Author reply:** Hmm, you are right. I added lazy creation of the derivatives.
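The `grad` method in the diff is just the chain rule: the upstream gradient `x_grad` is multiplied by the local derivative of the wrapped function at `x`. A small numeric sketch of that structure, with a hypothetical plain function in place of the spline op and no theano assumed:

```python
def make_grad(f_prime):
    """Return a grad function applying the chain rule, as in the Op's grad():
    upstream gradient times the local derivative f'(x)."""
    def grad(x, upstream_grad):
        return upstream_grad * f_prime(x)
    return grad

# Example: f(x) = x**2, so f'(x) = 2*x.
g = make_grad(lambda x: 2.0 * x)
```

In the real Op, `self.grad_op(x)` plays the role of `f_prime(x)`, evaluated symbolically on the theano graph rather than numerically.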
**Review comment:** I think this line is too long; we can wrap this import in brackets and do a multiline import.