
Add support (and automated github workflow pytests) for newer python versions #89

@cleong110

Recently I tried to install the library on a new machine and could not, despite trying various Python versions. I could not get it working until Python 3.8, and even then I still had to manually install something. But Python 3.8 is already at end-of-life: https://devguide.python.org/versions/

It would be nice to:

  1. Support newer Python versions
  2. Automatically check that the library installs and runs with those versions (see the pytest sketch after this list).
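
As a rough idea of what that automated check could run for each Python version in a GitHub workflow matrix, here is a minimal pytest smoke test. The file name and test bodies are a sketch of mine, not something that already exists in the repo:

# tests/test_smoke.py -- hypothetical smoke test a CI matrix job could run
def test_datasets_import():
    # A bare import is enough to reproduce the failures reported below.
    import sign_language_datasets.datasets  # noqa: F401


def test_config_import():
    from sign_language_datasets.datasets.config import SignDatasetConfig  # noqa: F401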

What I did to test:

conda create -n sldata python=3.12 # kept editing this part
conda activate sldata
pip install sign-language-datasets
# run python script that imports
# repeat
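
The "python script that imports" was roughly the following. This is a sketch reconstructed from the traceback and the dgs_corpus mention below; the real sldata_download.py is not included here, and the SignDatasetConfig arguments are an assumption based on the library's usual usage:

# sldata_download.py (sketch; exact contents are an assumption)
import tensorflow_datasets as tfds

import sign_language_datasets.datasets  # the import that fails in the traceback below
from sign_language_datasets.datasets.config import SignDatasetConfig

# assumed config; the keyword arguments follow the library's usual pattern
config = SignDatasetConfig(name="only-annotations", version="1.0.0", include_video=False)
dgs_corpus = tfds.load("dgs_corpus", builder_kwargs=dict(config=config))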

And I had the following results:

  • 3.12: error on pip install: no compatible tensorflow
  • 3.11: error on import: numpy.core.multiarray
  • 3.10: again, an error on import
  • 3.9: same results (see the traceback below)
  • 3.8: ModuleNotFoundError: No module named 'webvtt'; had to manually install webvtt-py. Also, when attempting to download dgs_corpus, got "ModuleNotFoundError: No module named 'lxml'"
Traceback (most recent call last):
  File "/opt/home/cleong/data_munging/ud-vlab/data_munging/sldata_download.py", line 4, in <module>
    import sign_language_datasets.datasets
  File "/opt/conda/envs/sldata/lib/python3.10/site-packages/sign_language_datasets/datasets/__init__.py", line 2, in <module>
    from .asl_lex import AslLex
  File "/opt/conda/envs/sldata/lib/python3.10/site-packages/sign_language_datasets/datasets/asl_lex/__init__.py", line 3, in <module>
    from .asl_lex import AslLex
  File "/opt/conda/envs/sldata/lib/python3.10/site-packages/sign_language_datasets/datasets/asl_lex/asl_lex.py", line 7, in <module>
    from ...datasets.config import SignDatasetConfig
  File "/opt/conda/envs/sldata/lib/python3.10/site-packages/sign_language_datasets/datasets/config.py", line 3, in <module>
    import cv2
  File "/opt/conda/envs/sldata/lib/python3.10/site-packages/cv2/__init__.py", line 8, in <module>
    from .cv2 import *
ImportError: numpy.core.multiarray failed to import

I also tried specifying various versions of opencv-python and numpy, but had no luck.

I also tried installing it on Colab with no version constraints at all and got:

ImportError: cannot import name 'get_dl_dirname' from 'tensorflow_datasets.core.download.resource' (/usr/local/lib/python3.11/dist-packages/tensorflow_datasets/core/download/resource.py)

Notebook here
