Raise NotImplementedError for SplineWrapper gradient operation #2211


Merged: 7 commits, May 27, 2017
Changes from 4 commits
4 changes: 2 additions & 2 deletions pymc3/distributions/continuous.py
@@ -16,7 +16,7 @@
from pymc3.theanof import floatX
from . import transforms

from .dist_math import bound, logpow, gammaln, betaln, std_cdf, i0, i1, alltrue_elemwise, DifferentiableSplineWrapper
from .dist_math import bound, logpow, gammaln, betaln, std_cdf, i0, i1, alltrue_elemwise, SplineWrapper
Member commented:

I think this line is too long, we can wrap this import in brackets and do multiline import
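For example, the wrapped import could look roughly like this (a sketch of the suggestion, not the exact committed line):

from .dist_math import (bound, logpow, gammaln, betaln, std_cdf, i0, i1,
                        alltrue_elemwise, SplineWrapper)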

from .distribution import Continuous, draw_values, generate_samples, Bound

__all__ = ['Uniform', 'Flat', 'Normal', 'Beta', 'Exponential', 'Laplace',
@@ -1430,7 +1430,7 @@ def __init__(self, x_points, pdf_points, transform='interval',
Z = interp.integral(x_points[0], x_points[-1])

self.Z = tt.as_tensor_variable(Z)
self.interp_op = DifferentiableSplineWrapper(interp)
self.interp_op = SplineWrapper(interp)
self.x_points = x_points
self.pdf_points = pdf_points / Z
self.cdf_points = interp.antiderivative()(x_points) / Z
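For context, the wrapped spline is what backs the interpolated distribution whose constructor is shown above (pm.Interpolated in PyMC3). A minimal usage sketch, with made-up data and assuming a PyMC3 build containing this change, might look like:

import numpy as np
import pymc3 as pm

x_points = np.linspace(-3, 3, 100)
pdf_points = np.exp(-x_points ** 2 / 2)  # unnormalized is fine; the constructor divides by Z

with pm.Model():
    x = pm.Interpolated('x', x_points, pdf_points)
    trace = pm.sample(1000)  # gradient-based samplers go through SplineWrapper's grad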
24 changes: 12 additions & 12 deletions pymc3/distributions/dist_math.py
@@ -365,6 +365,7 @@ def conjugate_solve_triangular(outer, inner):
grad = tt.triu(s + s.T) - tt.diag(tt.diagonal(s))
return [tt.switch(ok, grad, floatX(np.nan))]


class SplineWrapper (theano.Op):
Member commented:

pep8 does not recommend spacing before opening brackets

Author (@ghost) commented on May 27, 2017:

I've fixed all these issues. But just to argue about PEP8... :)

PEP8 forbids a space before the open parenthesis of a function call, but it doesn't say anything about whitespace before the base class in a class definition:

Avoid extraneous whitespace in the following situations:
... (nothing interesting)
Immediately before the open parenthesis that starts the argument list of a function call
... (nothing interesting)

Also, it says that

Method definitions inside a class are surrounded by a single blank line

so a blank line before the first method in a class is legitimate, since it surrounds that method; one could even argue that PEP8 requires it there.

Actually, the pep8 linter seems to accept both ways:

$ cat t.py 
class Foo (object):

    def foo(self):
        print('hi')


class Bar(object):
    def bar(self):
        print('hi')
$ pep8 t.py
$ # no errors reported

And since PEP8 is a formalized description of the code style used in the Python standard library, we can check what is used there, and it turns out that both variants are present.

However, this was more just for the sake of argument. I've already committed the fixes, and I see that we need to not only follow PEP8 but also stay consistent with the rest of PyMC3, so I agree with your nitpicks and find them absolutely reasonable :)

As an off-topic note, I think it would be nice to have pep8 or flake8 run automatically on Travis and fail builds on PEP8 violations. However, I briefly investigated this possibility, and it seems we would need to either fix some rather large parts of the existing code and/or agree on exceptions from PEP8 that we don't want to follow, such as the maximum line length.

"""
Creates a theano operation from scipy.interpolate.UnivariateSpline
@@ -381,18 +382,17 @@ def perform(self, node, inputs, output_storage):
x, = inputs
output_storage[0][0] = np.asarray(self.spline(x))

class DifferentiableSplineWrapper (SplineWrapper):
"""
Creates a theano operation with defined gradient from
scipy.interpolate.UnivariateSpline
"""

def __init__(self, spline):
super(DifferentiableSplineWrapper, self).__init__(spline)
self.spline_grad = SplineWrapper(spline.derivative())
self.__props__ += ('spline_grad',)

def grad(self, inputs, grads):
x, = inputs
x_grad, = grads
return [x_grad * self.spline_grad(x)]

if not hasattr(self, 'grad_op'):
try:
self.grad_op = SplineWrapper(self.spline.derivative())
Member commented:

What about using property?

Author (@ghost) commented on May 24, 2017:

I think we don't want to add grad_op to __props__ because it would break __hash__() and __eq__(). For example, this code works now as expected, but wouldn't work if grad_op is added to __props__:

s = InterpolatedUnivariateSpline(...)
op1 = SplineWrapper(s)
op2 = SplineWrapper(s)
op2.grad(...) # calculate gradient
assert op1 == op2

One would expect op1 to be equal to op2 because they wrap the same spline, but if grad_op were added to __props__, that would no longer be the case.

As far as I understand, the whole purpose of __props__ is to implement correct comparisons:

__props__ enables the automatic generation of appropriate __eq__() and __hash__(). Given the method __eq__(), automatically generated from __props__, two ops will be equal if they have the same values for all the properties listed in __props__. Given to the method __hash__() automatically generated from __props__, two ops will be have the same hash if they have the same values for all the properties listed in __props__. __props__ will also generate a suitable __str__() for your op. This requires development version after September 1st, 2014 or version 0.7.

http://deeplearning.net/software/theano/extending/extending_theano.html#op-s-auxiliary-methods

Member commented:

Good point. Isn't the derivative deterministic? In that case only the one prop (spline) is required.

Author (@ghost) commented on May 24, 2017:

Yes, that's why we have only one Theano prop, spline. grad_op is just a helper attribute for internal use, needed to avoid recomputing the derivative spline; it is not a Theano prop.

Member commented:

By 'property' I meant:

@property
def grad_op(self):...

Author commented:

But we don't want to expose it to the user, do we?

Author (@ghost) commented on May 24, 2017:

I mean, what would the possible usages be? This class is internal to PyMC3 anyway, so I think it only needs the functionality that PyMC3 actually uses.

Member commented:

Why not?

Member commented:

Moreover, it would be prettier.

Author (@ghost) commented on May 27, 2017:

@ferrine Hmm, why not indeed :) Added it in a new commit.
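For illustration only (this block is not part of the diff above, and not necessarily the exact code in that commit), a cached grad_op property along the lines discussed could look like:

@property
def grad_op(self):
    # Build the derivative Op lazily and cache it in a plain attribute, so it
    # stays out of __props__ and does not affect __eq__/__hash__.
    if not hasattr(self, '_grad_op'):
        try:
            self._grad_op = SplineWrapper(self.spline.derivative())
        except ValueError:
            self._grad_op = None

    if self._grad_op is None:
        raise NotImplementedError('Spline of order 0 is not differentiable')

    return self._grad_op

With that in place, grad itself could reduce to return [x_grad * self.grad_op(x)].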

except ValueError:
self.grad_op = None

if self.grad_op is None:
raise NotImplementedError('Spline of order 0 is not differentiable')
else:
return [x_grad * self.grad_op(x)]
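To illustrate the behaviour this PR targets, here is a rough sketch with made-up data (it assumes SplineWrapper accepts a vector input, as it does when used for the interpolated distribution): differentiating through an order-0 spline should now raise an explicit NotImplementedError instead of failing obscurely.

import numpy as np
import theano.tensor as tt
from scipy.interpolate import InterpolatedUnivariateSpline
from pymc3.distributions.dist_math import SplineWrapper

x_points = np.linspace(0, 1, 10)
y_points = x_points ** 2
x = tt.dvector('x')

# A linear (k=1) spline can be differentiated once, so building the gradient works.
linear = InterpolatedUnivariateSpline(x_points, y_points, k=1)
tt.grad(tt.sum(SplineWrapper(linear)(x)), x)

# Its derivative is an order-0 spline, which cannot be differentiated again.
step_op = SplineWrapper(linear.derivative())
try:
    tt.grad(tt.sum(step_op(x)), x)
except NotImplementedError as exc:
    print(exc)  # Spline of order 0 is not differentiable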