Changes from all commits (45 commits)
862e800
Update numpy version constraint to allow 1.24.x
rly Nov 21, 2025
51a42e9
Merge pull request #1 from rly/feature/numpy-1.24.0-support
rly Nov 21, 2025
a25e4c7
Fix pytest 9 deprecation warnings
rly Nov 21, 2025
0903e7f
Merge pull request #3 from rly/feature/fix-pytest-deprecations
rly Nov 21, 2025
d88e9c4
Update moto to v5+ and fix mock_s3 to mock_aws imports
rly Nov 21, 2025
c80bcfe
Remove redundant decorator
rly Nov 21, 2025
29ac167
Merge pull request #4 from rly/feature/fix-cloud-cache-tests
rly Nov 22, 2025
a47faab
Fix pynwb compatibility issues
rly Nov 22, 2025
a335024
Fix pynwb compatibility issues
rly Nov 22, 2025
5341672
Merge pull request #5 from rly/feature/fix-nwb-tests
rly Nov 22, 2025
79c4c2e
Fix pytest.warns(None) TypeError in newer pytest versions
rly Nov 22, 2025
31f3763
Fix GitHub Actions for macos-latest ARM64 transition
rly Nov 22, 2025
078c2a7
Fix aiohttp ClientSession creation outside async context
rly Nov 22, 2025
6c390a5
Merge pull request #6 from rly/feature/fix-misc-test-errors
rly Nov 22, 2025
89c8c45
Fix numpy 1.24+ inhomogeneous array in gaze_mapper
rly Nov 23, 2025
88fe8d9
Fix test_fitgaussian2D_failure mock for numpy 1.24+ compatibility
rly Nov 23, 2025
bc9ca74
Merge pull request #7 from rly/feature/fix-inhomogeneous-array-errors
rly Nov 23, 2025
8a2e8f8
Drop Python 3.8/3.9 support and handle missing glymur gracefully
rly Nov 24, 2025
3f7c9cf
Remove Python 3.12 support due to numpy 1.24.x incompatibility
rly Nov 24, 2025
a1d6897
Merge pull request #8 from rly/feature/fix-jpeg-glymur-errors
rly Nov 24, 2025
3cb026c
Update macOS CI runners to macos-15-intel and add macos-latest (ARM64)
rly Jan 11, 2026
00608e7
Add tolerance for curve fitting tests on ARM64
rly Jan 11, 2026
6749d2d
Add numpy 1.26.x support and Python 3.12/3.13 compatibility
rly Jan 11, 2026
7ef70d4
Allow pandas 1.5.3+ to maintain compatibility with existing code
rly Jan 11, 2026
4f3cc5b
Remove scipy version constraint to allow pip to resolve based on Pyth…
rly Jan 11, 2026
c017460
Add Python 3.14 support
rly Jan 11, 2026
e2b3573
Remove Python 3.14 support due to missing dependency wheels
rly Jan 11, 2026
f926373
Fix scipy 1.14 and pandas 2.x compatibility issues
rly Jan 11, 2026
0ab1a89
Fix scipy and pandas compatibility issues for Python 3.10+
rly Jan 11, 2026
33e6129
Fix platform-specific test failures for ARM64 and Windows
rly Jan 11, 2026
0aa6547
Fix cross-platform test compatibility issues
rly Jan 11, 2026
0d85782
Add missing check_index_type=False to test_get_stimulus_presentations
rly Jan 12, 2026
9af6eea
Update test dependencies for Python 3.12/3.13 compatibility
rly Jan 12, 2026
fa8b8af
Fix mock assertion typos and disable parallel testing on Windows 3.12+
rly Jan 12, 2026
d380c04
Fix test_get_ephys_sweeps assertions to match actual function signatures
rly Jan 12, 2026
970e799
Fix test_get_ephys_sweeps to verify caching behavior correctly
rly Jan 12, 2026
942cbb2
Remove deprecated pep8/pytest-pep8 and update test dependencies
rly Jan 12, 2026
1f47802
Fix broken test_get_ephys_sweeps and disable coverage on Windows 3.13
rly Jan 12, 2026
dfa73f0
Enable parallel testing on Windows Python 3.12
rly Jan 12, 2026
291f8d5
Debug: Run tests in batches on Windows Python 3.13 to isolate segfault
rly Jan 12, 2026
216b0f8
Debug: Test basic pytest and imports on Windows Python 3.13
rly Jan 12, 2026
c370ac3
Exclude Windows Python 3.13 from CI due to numpy crash
rly Jan 12, 2026
920e6b7
Add accurate notes about numpy<2 and Python 3.13 compatibility
rly Jan 12, 2026
36c2a69
Fix pandas FutureWarnings in ecephys_project_cache
rly Jan 13, 2026
036e383
Apply suggestions from code review
rly Feb 12, 2026
23 changes: 16 additions & 7 deletions .github/workflows/github-actions-ci.yml
@@ -26,7 +26,7 @@ jobs:
name: Lint
runs-on: "ubuntu-latest"
steps:
-  - uses: actions/checkout@v2
+  - uses: actions/checkout@v4
- name: flake8 linting
run: |
pip install flake8
@@ -43,15 +43,24 @@
runs-on: ${{ matrix.os }}
strategy:
matrix:
-  os: ["macos-latest", "windows-latest", "ubuntu-latest"]
-  python-version: ["3.8", "3.9", "3.10", "3.11"]
+  # macos-15-intel for x86_64, macos-latest for ARM64
+  os: ["macos-15-intel", "macos-latest", "windows-latest", "ubuntu-latest"]
+  python-version: ["3.10", "3.11", "3.12", "3.13"]
exclude:
# Exclude Windows Python 3.13 because numpy<2 lacks support: conda-forge
# has no numpy 1.x builds, so conda falls back to an experimental
# MINGW-W64 build that segfaults on import; remove numpy<2 to enable it.
- os: windows-latest
python-version: "3.13"
fail-fast: false
defaults:
run:
shell: bash -l {0}
steps:
-  - uses: actions/checkout@v2
-  - uses: conda-incubator/setup-miniconda@v2
+  - uses: actions/checkout@v4
+  - uses: conda-incubator/setup-miniconda@v3
with:
auto-update-conda: true
python-version: ${{ matrix.python-version }}
@@ -71,9 +80,9 @@
runs-on: ["self-hosted"]
strategy:
matrix:
-  image: ["allensdk_local_py38:latest"]
+  image: ["allensdk_local_py310:latest"]
steps:
-  - uses: actions/checkout@v2
+  - uses: actions/checkout@v4
- name: run test in docker
run: |
docker run \
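The `exclude` entry above removes a single cell from the job matrix. As a rough sketch (assuming a plain cross product filtered by exact-match exclude entries, which is how GitHub Actions documents matrix expansion), the mechanics look like:

```python
from itertools import product

# Axes copied from the workflow matrix above; the exclude entry mirrors
# the windows-latest / Python 3.13 exclusion.
oses = ["macos-15-intel", "macos-latest", "windows-latest", "ubuntu-latest"]
pythons = ["3.10", "3.11", "3.12", "3.13"]
exclude = [{"os": "windows-latest", "python-version": "3.13"}]

# A matrix expands to the cross product of its axes, minus any
# combination that matches an exclude entry exactly.
jobs = [
    {"os": o, "python-version": p}
    for o, p in product(oses, pythons)
    if {"os": o, "python-version": p} not in exclude
]
```

4 OSes by 4 Python versions give 16 combinations; the single exclusion leaves 15 jobs.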
4 changes: 2 additions & 2 deletions .github/workflows/nightly.yml
@@ -10,10 +10,10 @@ jobs:
runs-on: ["self-hosted"]
strategy:
matrix:
-  image: ["allensdk_local_py38:latest"]
+  image: ["allensdk_local_py310:latest"]
branch: ["master", "rc/**"]
steps:
-  - uses: actions/checkout@v2
+  - uses: actions/checkout@v4
with:
ref: ${{ matrix.branch }}
- name: run test in docker
4 changes: 2 additions & 2 deletions .github/workflows/notebook_runner.yml
@@ -9,11 +9,11 @@ jobs:
strategy:
matrix:
os: ["ubuntu-latest"]
-  python-version: ["3.8"]
+  python-version: ["3.10"]
fail-fast: false
runs-on: ${{ matrix.os }}
steps:
-  - uses: actions/checkout@v3
+  - uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
with:
@@ -129,7 +129,7 @@ def from_nwb(
cls,
nwbfile: NWBFile
) -> "RunningAcquisition":
-  running_module = nwbfile.modules['running']
+  running_module = nwbfile.processing['running']
dx_interface = running_module.get_data_interface('dx')

dx = dx_interface.data
@@ -188,7 +188,7 @@ def from_nwb(
nwbfile: NWBFile,
filtered=True
) -> "RunningSpeed":
-  running_module = nwbfile.modules['running']
+  running_module = nwbfile.processing['running']
interface_name = 'speed' if filtered else 'speed_unfiltered'
running_interface = running_module.get_data_interface(interface_name)

@@ -1,4 +1,5 @@
import os
import warnings
import numpy as np
from pathlib import Path
from typing import Optional, List, Dict
@@ -184,17 +185,44 @@ def to_nwb(self, nwbfile: NWBFile,

nwbfile.add_stimulus_template(visual_stimulus_image_series)

# DEPRECATED (11/2025): _add_image_index_to_nwb is no longer called.
# The IndexSeries using indexed_timeseries is deprecated in pynwb 2.x
# and would require significant refactoring to use indexed_images instead.
# Since this NWB writing code will not be used for new data generation,
# we skip this step rather than refactor. The stimulus template data
# is still written above; only the IndexSeries linking is skipped.
# See: https://pynwb.readthedocs.io/en/stable/pynwb.image.html#pynwb.image.IndexSeries
if 'image_index' in stimulus_presentations.value \
and self._image_template_key is not None:
-  nwbfile = self._add_image_index_to_nwb(
-      nwbfile=nwbfile, presentations=stimulus_presentations)
+  warnings.warn(
+      "As of 11/2025, Templates.to_nwb() no longer adds the image "
+      "index (IndexSeries) to the NWB file. The IndexSeries "
+      "indexed_timeseries field is deprecated in pynwb 2.x. "
+      "The stimulus template data is still written.",
+      UserWarning,
+      stacklevel=2
+  )

return nwbfile

def _add_image_index_to_nwb(
self, nwbfile: NWBFile, presentations: Presentations):
"""Adds the image index and start_time for all stimulus templates
-  to NWB"""
+  to NWB

.. deprecated:: 2.16.3
This method is deprecated as of 11/2025. The IndexSeries
indexed_timeseries field is deprecated in pynwb 2.x. This method
is no longer called from to_nwb() and will be removed in a future
release.
"""
warnings.warn(
"_add_image_index_to_nwb is deprecated and will be removed in a "
"future release. The IndexSeries indexed_timeseries field is "
"deprecated in pynwb 2.x.",
DeprecationWarning,
stacklevel=2
)
stimulus_templates = self.value[self._image_template_key]
presentations = presentations.value

@@ -208,7 +236,7 @@ def _add_image_index_to_nwb(
image_index = IndexSeries(
name=nwb_template.name,
data=stimulus_index['image_index'].values,
-  unit='None',
+  unit='N/A',
Review comment on `unit='N/A',`:

Copilot AI (Jan 27, 2026): The change from 'None' to 'N/A' for the IndexSeries unit may break compatibility with existing code or downstream consumers expecting 'None'. Since this method is deprecated, consider whether this change is necessary or if it should remain as 'None' to avoid breaking changes during the deprecation period.

Suggested change:
-  unit='N/A',
+  unit='None',

@rly (author, Feb 12, 2026): The 'unit' field of IndexSeries is fixed to the value 'N/A' in the NWB standard. Recent versions of NWB do not allow for other values. Changing this back to "None" results in files that will fail validation.

indexed_timeseries=nwb_template,
timestamps=stimulus_index['start_time'].values)
nwbfile.add_stimulus(image_index)
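The deprecation pattern used in `_add_image_index_to_nwb` — `warnings.warn` with `DeprecationWarning` and `stacklevel=2` so the warning is attributed to the caller — can be exercised in isolation. A minimal sketch (the helper name here is hypothetical, not from the codebase):

```python
import warnings

def deprecated_helper():
    # stacklevel=2 attributes the warning to the *caller* of this
    # function rather than to the warnings.warn() line itself,
    # mirroring the stacklevel=2 used in _add_image_index_to_nwb.
    warnings.warn(
        "deprecated_helper is deprecated and will be removed",
        DeprecationWarning,
        stacklevel=2,
    )
    return 42

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # DeprecationWarning is hidden by default
    result = deprecated_helper()
```

Without `simplefilter("always")`, default filters suppress `DeprecationWarning` outside of `__main__`, which is why tests that assert on these warnings usually override the filter first.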
@@ -93,10 +93,10 @@ def write_bytes(path: str, stream: Iterable[bytes]):
class AsyncHttpEngine(HttpEngine):

def __init__(
self,
scheme: str,
host: str,
session: Optional[aiohttp.ClientSession] = None,
**kwargs
):
""" Simple tool for making asynchronous streaming http requests.
Expand All @@ -105,11 +105,11 @@ def __init__(
----------
scheme :
e.g "http" or "https"
host :
will be used as the base for request urls
session :
If provided, this preconstructed session will be used rather than
a new one. Keep in mind that AsyncHttpEngine closes its session
when it is garbage collected!
**kwargs :
Will be passed to parent.
Expand All @@ -119,14 +119,25 @@ def __init__(
super(AsyncHttpEngine, self).__init__(scheme, host, **kwargs)

if session:
-  self.session = session
+  self._session = session
self._owns_session = False
warnings.warn(
-  "Recieved preconstructed session, ignoring timeout parameter."
+  "Received preconstructed session, ignoring timeout parameter."
)
else:
-  self.session = aiohttp.ClientSession(
# Defer session creation until actually needed in an async context
# (aiohttp 3.9+ requires ClientSession to be created within an event loop)
self._session = None
self._owns_session = True

@property
def session(self) -> aiohttp.ClientSession:
"""Lazily create the aiohttp session when first accessed."""
if self._session is None:
self._session = aiohttp.ClientSession(
timeout=aiohttp.client.ClientTimeout(self.timeout)
)
return self._session

async def _stream_coroutine(
self,
Expand Down Expand Up @@ -169,10 +180,10 @@ def stream(
return functools.partial(self._stream_coroutine, route)

def __del__(self):
-  if hasattr(self, "session"):
+  if hasattr(self, "_session") and self._session is not None:
nest_asyncio.apply()
loop = asyncio.get_event_loop()
-  loop.run_until_complete(self.session.close())
+  loop.run_until_complete(self._session.close())

@staticmethod
def write_bytes(
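The deferred-construction idea behind the new `session` property generalizes beyond aiohttp: build the expensive object on first access, in whatever context the caller provides. A minimal sketch of the same pattern with aiohttp swapped out for a plain factory so it runs anywhere (the class and names below are hypothetical):

```python
class LazyResource:
    """Lazily build a resource on first property access, or adopt a
    preconstructed one — the same shape as AsyncHttpEngine's session
    handling above."""

    def __init__(self, factory, preconstructed=None):
        self._resource = preconstructed
        self._owns_resource = preconstructed is None
        self._factory = factory

    @property
    def resource(self):
        # Create the resource lazily, exactly once.
        if self._resource is None:
            self._resource = self._factory()
        return self._resource


calls = []

def make_session():
    calls.append("created")
    return object()

holder = LazyResource(make_session)
assert calls == []        # nothing is built at construction time
first = holder.resource   # factory runs here, on first access
second = holder.resource  # cached object is reused
```

For `AsyncHttpEngine`, the payoff is that the factory (here `aiohttp.ClientSession`) runs inside a running event loop, which aiohttp 3.9+ requires.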
9 changes: 6 additions & 3 deletions allensdk/brain_observatory/ecephys/ecephys_project_cache.py
@@ -737,9 +737,12 @@ def get_grouped_uniques(this, other, foreign_key, field_key, unique_key, inplace
if not inplace:
this = this.copy()

-  uniques = other.groupby(foreign_key)\
-      .apply(lambda grp: pd.DataFrame(grp)[field_key].unique())
-  this[unique_key] = 0
+  # Select only the field_key column before apply to avoid FutureWarning about
+  # grouping columns in pandas 2.2+
+  uniques = other.groupby(foreign_key)[field_key]\
+      .apply(lambda x: x.unique())
+  # Use object dtype to allow storing arrays of strings
+  this[unique_key] = None
this.loc[uniques.index.values, unique_key] = uniques.values

if not inplace:
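A toy reproduction of the `groupby` change (the column names are made up; only the select-before-apply shape matches `get_grouped_uniques`):

```python
import pandas as pd

# Hypothetical stand-in for `other` in get_grouped_uniques.
other = pd.DataFrame({
    "unit_id": [1, 1, 2, 2],
    "structure": ["VISp", "VISp", "VISl", "VISp"],
})

# Selecting the single column before .apply keeps the grouping column out
# of the object passed to the lambda, which is what avoids the pandas
# 2.2+ FutureWarning about operating on grouping columns.
uniques = other.groupby("unit_id")["structure"].apply(lambda x: x.unique())
```

The result is a Series indexed by the grouping key, with one array of unique values per group — the same shape the old `.apply(lambda grp: ...)` produced, minus the warning.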
4 changes: 3 additions & 1 deletion allensdk/brain_observatory/gaze_mapping/_gaze_mapper.py
@@ -289,7 +289,9 @@ def pupil_position_on_monitor_in_degrees(self,

mag = np.linalg.norm(self.monitor.position)
meridian = np.degrees(np.arctan(x / mag))
-  elevation = np.degrees(np.arctan(y / np.linalg.norm([x, mag], axis=0)))
+  # Use np.vstack to create a homogeneous 2D array (numpy 1.24+ compatibility)
+  elevation = np.degrees(np.arctan(y / np.linalg.norm(
+      np.vstack([x, np.full_like(x, mag, dtype=float)]), axis=0)))

angles = np.vstack([meridian, elevation]).T

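The failure mode and the fix can be seen with toy stand-ins for the gaze-mapper variables (values here are invented, only the shapes matter):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])  # a vector, as in the gaze mapper
mag = 5.0                      # a scalar

# Older numpy coerced the ragged list [x, mag] to an array (with a
# deprecation warning); numpy 1.24+ raises ValueError for such
# inhomogeneous sequences. Broadcasting the scalar into a row of
# matching length with np.full_like gives a homogeneous 2D array.
stacked = np.vstack([x, np.full_like(x, mag)])  # shape (2, 3)
norms = np.linalg.norm(stacked, axis=0)         # sqrt(x**2 + mag**2) per column
```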
6 changes: 3 additions & 3 deletions allensdk/brain_observatory/nwb/nwb_api.py
@@ -59,9 +59,9 @@ def get_running_speed(self, lowpass=True) -> RunningSpeed:
"""

interface_name = 'speed' if lowpass else 'speed_unfiltered'
-  values = self.nwbfile.modules['running'].get_data_interface(
+  values = self.nwbfile.processing['running'].get_data_interface(
       interface_name).data[:]
-  timestamps = self.nwbfile.modules['running'].get_data_interface(
+  timestamps = self.nwbfile.processing['running'].get_data_interface(
interface_name).timestamps[:]

return RunningSpeed(
@@ -87,7 +87,7 @@ def get_image(self, name, module, image_api=None) -> sitk.Image:
if image_api is None:
image_api = ImageApi

-  nwb_img = self.nwbfile.modules[module].get_data_interface(
+  nwb_img = self.nwbfile.processing[module].get_data_interface(
'images')[name]
data = nwb_img.data
resolution = nwb_img.resolution # px/cm
12 changes: 10 additions & 2 deletions allensdk/brain_observatory/receptive_field_analysis/chisquarerf.py
@@ -313,8 +313,16 @@ def interpolate_RF(rf_map, deg_per_pnt):
1,
)

-  interpolated = si.interp2d(x_coor, y_coor, rf_map)
-  interpolated = interpolated(x_interpolated, y_interpolated)
# interp2d was removed in scipy 1.14, use RectBivariateSpline instead
try:
interpolator = si.interp2d(x_coor, y_coor, rf_map)
interpolated = interpolator(x_interpolated, y_interpolated)
except NotImplementedError:
# RectBivariateSpline uses (row, col) ordering vs interp2d's (x, y),
# so arguments are swapped: (y_coor, x_coor) and (y_interpolated, x_interpolated)
# Use kx=ky=1 (linear) to match interp2d default behavior
interpolator = si.RectBivariateSpline(y_coor, x_coor, rf_map, kx=1, ky=1)
interpolated = interpolator(y_interpolated, x_interpolated)

return interpolated

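The (x, y) versus (row, col) argument swap is the easy mistake in this migration, so here is a small check on a known surface (a toy grid, not the receptive-field data):

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Toy grid: z[i, j] = y[i] + x[j], so interpolated values are known exactly.
y = np.arange(4.0)           # row coordinates
x = np.arange(5.0)           # column coordinates
z = y[:, None] + x[None, :]  # shape (4, 5)

# RectBivariateSpline takes (row_coords, col_coords, values) — the reverse
# of the removed interp2d's (x, y) — and kx=ky=1 gives linear interpolation,
# matching interp2d's default.
spline = RectBivariateSpline(y, x, z, kx=1, ky=1)
vals = spline(np.array([0.5, 1.5]), np.array([2.5]))  # evaluate at (y, x)
```

Because the surface is linear, the linear spline reproduces it exactly: `vals` is `[[3.0], [4.0]]`, i.e. y + x at (0.5, 2.5) and (1.5, 2.5).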
5 changes: 3 additions & 2 deletions allensdk/config/app/application_config.py
@@ -351,8 +351,9 @@ def apply_configuration_from_file(self, config_file_path):
cfg_string = self.from_json_file(config_file_path)
try:
config.readfp(io.BytesIO(cfg_string))
-  except (NameError, TypeError):
-      config.read_string(cfg_string)  # Python 3
+  except (NameError, TypeError, AttributeError):
+      # readfp was removed in Python 3.12
+      config.read_string(cfg_string)
else:
config.read(config_file_path)

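A standalone version of the fallback (the section and key names below are invented for illustration):

```python
import configparser
import io

cfg_string = "[archive]\nbasedir = /tmp/cache\n"
config = configparser.ConfigParser()

# ConfigParser.readfp was deprecated since Python 3.2 and removed in
# 3.12, where calling it raises AttributeError; read_string works on
# every Python 3, so the except clause keeps both old and new
# interpreters happy.
try:
    config.readfp(io.StringIO(cfg_string))  # pre-3.12 path (deprecated)
except AttributeError:
    config.read_string(cfg_string)          # 3.12+ path
```

Catching `NameError` and `TypeError` as well, as the patched code does, additionally covers the legacy Python 2 code path that fed `io.BytesIO` into `readfp`.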