Fanwang Meng committed on
Commit 4bbf21d · 1 Parent(s): 38eb05f

Rename `DiverseSelector` to `selector`

.github/CODE_OF_CONDUCT.md CHANGED
@@ -1,4 +1,4 @@
# Contributor Covenant Code of Conduct

- It is recommended to follow the code of conduct as described in
+ It is recommended to follow the code of conduct as described in
https://qcdevs.org/guidelines/QCDevsCodeOfConduct/.
.pre-commit-config.yaml CHANGED
@@ -42,15 +42,16 @@ repos:
# sorts entries in requirements.txt
- id: requirements-txt-fixer

- # https://github.com/asottile/setup-cfg-fmt
- - repo: https://github.com/asottile/setup-cfg-fmt
- rev: v2.5.0
- hooks:
- - id: setup-cfg-fmt
+ # we are not using this for now as we decided to use pyproject.toml instead
+ ## https://github.com/asottile/setup-cfg-fmt
+ #- repo: https://github.com/asottile/setup-cfg-fmt
+ # rev: v2.5.0
+ # hooks:
+ # - id: setup-cfg-fmt

# - repo: https://github.com/psf/black-pre-commit-mirror
- repo: https://github.com/psf/black
- rev: 23.10.0
+ rev: 23.10.1
hooks:
- id: black
args: [
@@ -90,8 +91,10 @@ repos:
^docs/
^doc/
)
- args: ["-c", "pyproject.toml"]
- additional_dependencies: ["bandit[toml]"]
+ args: [
+ "-c", "pyproject.toml"
+ ]
+ additional_dependencies: [ "bandit[toml]" ]

- repo: https://github.com/asottile/pyupgrade
rev: v3.15.0
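
Note: the bandit hook above now reads its configuration from `pyproject.toml` (hence the `-c pyproject.toml` argument and the `bandit[toml]` dependency), matching the `[tool.bandit]` table added later in this commit. A minimal sketch of how that table can be inspected locally, assuming Python 3.11+ for the standard-library `tomllib` (on older interpreters the `tomli` package listed in the optional dependencies offers the same `load` API):

```python
# Minimal sketch: read the [tool.bandit] table that the bandit pre-commit hook points at.
# Assumes pyproject.toml sits in the current working directory.
import tomllib

with open("pyproject.toml", "rb") as fh:  # tomllib requires a binary file handle
    config = tomllib.load(fh)

bandit_cfg = config.get("tool", {}).get("bandit", {})
print("bandit skips:", bandit_cfg.get("skips", []))
```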
devtools/README.md DELETED
@@ -1,63 +0,0 @@
- # Development, testing, and deployment tools
-
- This directory contains a collection of tools for running Continuous Integration (CI) tests,
- conda installation, and other development tools not directly related to the coding process.
-
-
- ## Manifest
-
- ### Continuous Integration
-
- You should test your code, but do not feel compelled to use these specific programs. You also may not need Unix and
- Windows testing if you only plan to deploy on specific platforms. These are just to help you get started.
-
- The items in this directory have been left for legacy purposes since the change to GitHub Actions,
- They will likely be removed in a future version.
-
- * `legacy-miniconda-setup`: A preserved copy of a helper directory which made Linux and OSX based testing through [Travis-CI](https://about.travis-ci.com/) simpler
- `before_install.sh`: Pip/Miniconda pre-package installation script for Travis. No longer needed thanks to
- [GitHub Actions](https://docs.github.com/en/free-pro-team@latest/actions) and the [conda-incubator/setup-miniconda Action](https://github.com/conda-incubator/setup-miniconda)
-
- ### Conda Environment:
-
- This directory contains the files to setup the Conda environment for testing purposes
-
- * `conda-envs`: directory containing the YAML file(s) which fully describe Conda Environments, their dependencies, and those dependency provenance's
- `test_env.yaml`: Simple test environment file with base dependencies. Channels are not specified here and therefore respect global Conda configuration
-
- ### Additional Scripts:
-
- This directory contains OS agnostic helper scripts which don't fall in any of the previous categories
- * `scripts`
- * `create_conda_env.py`: Helper program for spinning up new conda environments based on a starter file with Python Version and Env. Name command-line options
-
-
- ## How to contribute changes
- - Clone the repository if you have write access to the main repo, fork the repository if you are a collaborator.
- - Make a new branch with `git checkout -b {your branch name}`
- - Make changes and test your code
- - Ensure that the test environment dependencies (`conda-envs`) line up with the build and deploy dependencies (`conda-recipe/meta.yaml`)
- - Push the branch to the repo (either the main or your fork) with `git push -u origin {your branch name}`
- * Note that `origin` is the default name assigned to the remote, yours may be different
- - Make a PR on GitHub with your changes
- - We'll review the changes and get your code into the repo after lively discussion!
-
-
- ## Checklist for updates
- - [ ] Make sure there is an/are issue(s) opened for your specific update
- - [ ] Create the PR, referencing the issue
- - [ ] Debug the PR as needed until tests pass
- - [ ] Tag the final, debugged version
- * `git tag -a X.Y.Z [latest pushed commit] && git push --follow-tags`
- - [ ] Get the PR merged in
-
- ## Versioneer Auto-version
- [Versioneer](https://github.com/warner/python-versioneer) will automatically infer what version
- is installed by looking at the `git` tags and how many commits ahead this version is. The format follows
- [PEP 440](https://www.python.org/dev/peps/pep-0440/) and has the regular expression of:
- ```regexp
- \d+.\d+.\d+(?\+\d+-[a-z0-9]+)
- ```
- If the version of this commit is the same as a `git` tag, the installed version is the same as the tag,
- e.g. `DiverseSelector-0.1.2`, otherwise it will be appended with `+X` where `X` is the number of commits
- ahead from the last tag, and then `-YYYYYY` where the `Y`'s are replaced with the `git` commit hash.
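
Note: the Versioneer description in the removed README is abstract, and its regular expression appears garbled. An illustrative sketch of the `tag`, `+X`, `-hash` layout it describes, using a cleaned-up reading of that pattern and made-up version strings (not taken from Versioneer itself):

```python
# Illustrative only: example version strings in the layout the removed README describes.
import re

# Cleaned-up reading of the README's regexp; the optional group covers the "+X-hash" suffix.
pattern = re.compile(r"\d+\.\d+\.\d+(\+\d+-[a-z0-9]+)?")

examples = [
    "0.1.2",            # the commit sits exactly on the 0.1.2 tag
    "0.1.2+5-a1b2c3d",  # 5 commits ahead of 0.1.2, hypothetical commit hash a1b2c3d
]
for version in examples:
    print(version, bool(pattern.fullmatch(version)))
```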
devtools/conda-envs/test_env.yaml DELETED
@@ -1,18 +0,0 @@
- name: test
- channels:
-
- - defaults
- dependencies:
- # Base depends
- - python
- - pip
-
- # Testing
- - pytest
- - pytest-cov
- - codecov
-
- # Pip-only installs
- #- pip:
- # - codecov
-
devtools/legacy-miniconda-setup/before_install.sh DELETED
@@ -1,21 +0,0 @@
- # Temporarily change directory to $HOME to install software
- pushd .
- cd $HOME
- # Make sure some level of pip is installed
- python -m ensurepip
-
- if [ "$TRAVIS_OS_NAME" == "osx" ]; then
- HOMEBREW_NO_AUTO_UPDATE=1 brew upgrade pyenv
- # Pyenv requires minor revision, get the latest
- PYENV_VERSION=$(pyenv install --list |grep $PYTHON_VER | sed -n "s/^[ \t]*\(${PYTHON_VER}\.*[0-9]*\).*/\1/p" | tail -n 1)
- # Install version
- pyenv install -f $PYENV_VERSION
- # Use version for this
- pyenv global $PYENV_VERSION
- # Setup up path shims
- eval "$(pyenv init -)"
- fi
- pip install --upgrade pip setuptools
-
- # Restore original directory
- popd
devtools/scripts/create_conda_env.py DELETED
@@ -1,95 +0,0 @@
- import argparse
- import os
- import re
- import glob
- import shutil
- import subprocess as sp
- from tempfile import TemporaryDirectory
- from contextlib import contextmanager
- # YAML imports
- try:
-     import yaml  # PyYAML
-     loader = yaml.load
- except ImportError:
-     try:
-         import ruamel_yaml as yaml  # Ruamel YAML
-     except ImportError:
-         try:
-             # Load Ruamel YAML from the base conda environment
-             from importlib import util as import_util
-             CONDA_BIN = os.path.dirname(os.environ['CONDA_EXE'])
-             ruamel_yaml_path = glob.glob(os.path.join(CONDA_BIN, '..',
-                                                       'lib', 'python*.*', 'site-packages',
-                                                       'ruamel_yaml', '__init__.py'))[0]
-             # Based on importlib example, but only needs to load_module since its the whole package, not just
-             # a module
-             spec = import_util.spec_from_file_location('ruamel_yaml', ruamel_yaml_path)
-             yaml = spec.loader.load_module()
-         except (KeyError, ImportError, IndexError):
-             raise ImportError("No YAML parser could be found in this or the conda environment. "
-                               "Could not find PyYAML or Ruamel YAML in the current environment, "
-                               "AND could not find Ruamel YAML in the base conda environment through CONDA_EXE path. "
-                               "Environment not created!")
-     loader = yaml.YAML(typ="safe").load  # typ="safe" avoids odd typing on output
-
-
- @contextmanager
- def temp_cd():
-     """Temporary CD Helper"""
-     cwd = os.getcwd()
-     with TemporaryDirectory() as td:
-         try:
-             os.chdir(td)
-             yield
-         finally:
-             os.chdir(cwd)
-
-
- # Args
- parser = argparse.ArgumentParser(description='Creates a conda environment from file for a given Python version.')
- parser.add_argument('-n', '--name', type=str,
-                     help='The name of the created Python environment')
- parser.add_argument('-p', '--python', type=str,
-                     help='The version of the created Python environment')
- parser.add_argument('conda_file',
-                     help='The file for the created Python environment')
-
- args = parser.parse_args()
-
- # Open the base file
- with open(args.conda_file, "r") as handle:
-     yaml_script = loader(handle.read())
-
- python_replacement_string = "python {}*".format(args.python)
-
- try:
-     for dep_index, dep_value in enumerate(yaml_script['dependencies']):
-         if re.match('python([ ><=*]+[0-9.*]*)?$', dep_value):  # Match explicitly 'python' and its formats
-             yaml_script['dependencies'].pop(dep_index)
-             break  # Making the assumption there is only one Python entry, also avoids need to enumerate in reverse
- except (KeyError, TypeError):
-     # Case of no dependencies key, or dependencies: None
-     yaml_script['dependencies'] = []
- finally:
-     # Ensure the python version is added in. Even if the code does not need it, we assume the env does
-     yaml_script['dependencies'].insert(0, python_replacement_string)
-
- # Figure out conda path
- if "CONDA_EXE" in os.environ:
-     conda_path = os.environ["CONDA_EXE"]
- else:
-     conda_path = shutil.which("conda")
- if conda_path is None:
-     raise RuntimeError("Could not find a conda binary in CONDA_EXE variable or in executable search path")
-
- print("CONDA ENV NAME {}".format(args.name))
- print("PYTHON VERSION {}".format(args.python))
- print("CONDA FILE NAME {}".format(args.conda_file))
- print("CONDA PATH {}".format(conda_path))
-
- # Write to a temp directory which will always be cleaned up
- with temp_cd():
-     temp_file_name = "temp_script.yaml"
-     with open(temp_file_name, 'w') as f:
-         f.write(yaml.dump(yaml_script))
-     sp.call("{} env create -n {} -f {}".format(conda_path, args.name, temp_file_name), shell=True)
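
Note: the deleted helper above was a plain command-line script; its argparse options imply an invocation along the following lines (the environment name and Python version here are made up for illustration):

```python
# Hypothetical invocation of the removed helper, mirroring its argparse interface:
#   -n/--name    name of the conda environment to create
#   -p/--python  Python version to pin into the environment
#   conda_file   starter environment YAML, e.g. devtools/conda-envs/test_env.yaml
import subprocess

subprocess.run(
    [
        "python", "devtools/scripts/create_conda_env.py",
        "-n", "selector-test",  # example environment name
        "-p", "3.10",           # example Python version
        "devtools/conda-envs/test_env.yaml",
    ],
    check=True,
)
```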
pyproject.toml ADDED
@@ -0,0 +1,380 @@
+ # [build-system]
+ # build-backend = "setuptools.build_meta"
+ # requires = ["setuptools>=61.0", "wheel>=0.37.1"]
+ # This example pyproject.toml is for a basic pip+setuptools setup.
+ # If you use a project management tool (like Poetry), then
+ # those tools will have slightly different configurations or additions.
+
+ # I highly recommend using a project management tool for your project.
+ # Project management is a highly opinionated subject.
+ # There are a lot of good, robust tools in this space now (as of 2023)
+ # Two that I've used and recommend are Poetry and PDM.
+ # Poetry is more mature, PDM is recent, both work well.
+ # - Poetry: https://python-poetry.org/
+ # - PDM: https://pdm.fming.dev/latest/
+
+ # https://gist.githubusercontent.com/GhostofGoes/75051c4aeb215bc3cf48c10f5454b399/raw/499661609846ab52b8fc7e6acf0562275ca22517/pyproject.toml
+ # Resources
+ #
+ # - https://packaging.python.org/en/latest/tutorials/packaging-projects/
+ # - https://betterprogramming.pub/a-pyproject-toml-developers-cheat-sheet-5782801fb3ed
+ #
+ # Examples of pyproject.toml files from open-source projects:
+ # - https://github.com/carlosperate/awesome-pyproject
+ # - https://github.com/SpikeInterface/spikeinterface/blob/master/pyproject.toml
+ # - https://github.com/codespell-project/codespell/blob/master/pyproject.toml
+ # - https://github.com/hynek/structlog/blob/main/pyproject.toml
+
+
+ [project]
+ # https://packaging.python.org/en/latest/specifications/declaring-project-metadata/
+ name = "acme"
+ version = "0.0.1"
+ description = "ACME tool"
+ readme = "README.md"
+ requires-python = ">=3.9,<4.0"
+ # "LICENSE" is name of the license file, which must be in root of project folder
+ license = {file = "LICENSE"}
+ authors = [
+ {name = "Joe Example", email = "[email protected]"},
+ ]
+ keywords = ["keyword", "are", "cool"]
+
+ # https://pypi.org/classifiers/
+ # Add PyPI classifiers here
+ classifiers = [
+ "Development Status :: 5 - Production/Stable",
+ "Environment :: Console",
+ "License :: OSI Approved :: MIT License",
+ "Natural Language :: English",
+ "Operating System :: MacOS",
+ "Operating System :: Microsoft :: Windows",
+ "Operating System :: POSIX",
+ "Operating System :: POSIX :: Linux",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: Implementation :: CPython",
+ ]
+
+ # Required dependencies for install/usage of your package or application
+ # If you don't have any dependencies, leave this section empty
+ # Format for dependency strings: https://peps.python.org/pep-0508/
+ dependencies = [
+ # ... put stuff here ...
+ ]
+
+ [project.scripts]
+ # Command line interface entrypoint scripts
+ acme = "acme.__main__:main"
+
+ [project.urls]
+ # Use PyPI-standard names here
+ # Homepage
+ # Documentation
+ # Changelog
+ # Issue Tracker
+ # Source
+ # Discord server
+ "Homepage" = "<url>"
+ "Documentation" = "<url>"
+
+ # Development dependencies
+ # pip install -e .[lint,test,exe]
+ # pip install -e .[dev]
+ [project.optional-dependencies]
+ lint = [
+ # checks for spelling mistakes
+ "codespell>=2.2.4",
+
+ # ruff linter checks for issues and potential bugs
+ "ruff",
+
+ # checks for unused code
+ "vulture",
+
+ # required for codespell to parse pyproject.toml
+ "tomli",
+
+ # validation of pyproject.toml
+ "validate-pyproject[all]",
+
+ # automatic sorting of imports
+ "isort",
+
+ # automatic code formatting to follow a consistent style
+ "black",
+ ]
+
+ test = [
+ # Handles most of the testing work, including execution
+ # Docs: https://docs.pytest.org/en/stable/contents.html
+ "pytest",
+
+ # "Coverage" is how much of the code is actually run (it's "coverage")
+ # Generates coverage reports from test suite runs
+ "pytest-cov",
+ "tomli",
+
+ # pytest wrapper around the "mock" library
+ "pytest-mock",
+
+ # Randomizes the order of test execution
+ "pytest-randomly",
+
+ # Required for comparing device data exports and a few other complex structures
+ # Docs: https://zepworks.com/deepdiff/current/
+ "deepdiff",
+
+ # Test parallelization, as well as remote execution (which we may do someday)
+ "pytest-xdist[psutil]",
+
+ # Detailed pytest results saved to a JSON file
+ "pytest-json-report",
+
+ # Better parsing of doctests
+ "xdoctest",
+
+ # Mocking of HTTP responses for the "requests" module
+ "requests-mock",
+
+ # Colors for doctest output
+ "Pygments",
+ ]
+
+ exe = [
+ "setuptools",
+ "wheel",
+ "build",
+ "tomli",
+ "pyinstaller",
+ "staticx;platform_system=='Linux'",
+ ]
+
+ dev = [
+ # https://hynek.me/articles/python-recursive-optional-dependencies/
+ "acme[lint,test,exe]",
+
+ # Useful for building quick scripts for testing purposes
+ # https://github.com/google/python-fire
+ "fire",
+
+ # Code quality tools
+ "mypy",
+
+ # Improved exception traceback output
+ # https://github.com/qix-/better-exceptions
+ "better_exceptions",
+
+ # Analyzing dependencies
+ # install graphviz to generate graphs
+ "graphviz",
+ "pipdeptree",
+ ]
+
+
+ [tool.setuptools]
+ # https://setuptools.pypa.io/en/latest/userguide/pyproject_config.html
+ platforms = ["Linux", "Windows"]
+ include-package-data = true
+ zip-safe = true # This just means it's safe to zip up the bdist
+
+ # Non-code data that should be included in the package source code
+ # https://setuptools.pypa.io/en/latest/userguide/datafiles.html
+ [tool.setuptools.package-data]
+ acme = ["*.xml"]
+
+ # Python modules and packages that are included in the
+ # distribution package (and therefore become importable)
+ [tool.setuptools.packages.find]
+ exclude = ["tests", "tests.*", "examples"]
+
+
+
+ # PDM example
+ [tool.pdm.scripts]
+ isort = "isort acme"
+ black = "black acme"
+ format = {composite = ["isort", "black"]}
+ check_isort = "isort --check acme tests"
+ check_black = "black --check acme tests"
+ vulture = "vulture --min-confidence 100 acme tests"
+ ruff = "ruff check acme tests"
+ fix = "ruff check --fix acme tests"
+ codespell = "codespell --toml ./pyproject.toml"
+ lint = {composite = ["vulture", "codespell", "ruff", "check_isort", "check_black"]}
+
+
+
+ [tool.codespell]
+ # codespell supports pyproject.toml since version 2.2.2
+ # NOTE: the "tomli" package must be installed for this to work
+ # https://github.com/codespell-project/codespell#using-a-config-file
+ # NOTE: ignore words for codespell must be lowercase
+ check-filenames = ""
+ ignore-words-list = "word,another,something"
+ skip = "htmlcov,.doctrees,*.pyc,*.class,*.ico,*.out,*.PNG,*.inv,*.png,*.jpg,*.dot"
+
+
+ [tool.black]
+ line-length = 88
+ # If you need to exclude directories from being reformatted by black
+ # force-exclude = '''
+ # (
+ # somedirname
+ # | dirname
+ # | filename\.py
+ # )
+ # '''
+
+
+ [tool.isort]
+ profile = "black"
+ known_first_party = ["acme"]
+ # If you need to exclude files from having their imports sorted
+ extend_skip_glob = [
+ "acme/somefile.py",
+ "acme/somedir/*",
+ ]
+
+
+ # https://beta.ruff.rs/docs
+ [tool.ruff]
+ line-length = 99
+ show-source = true
+
+ # Rules: https://beta.ruff.rs/docs/rules
+ # If you violate a rule, lookup the rule on the Rules page in ruff docs.
+ # Many rules have links you can click with a explanation of the rule and how to fix it.
+ # If there isn't a link, go to the project the rule was source from (e.g. flake8-bugbear)
+ # and review it's docs for the corresponding rule.
+ # If you're still confused, ask a fellow developer for assistance.
+ # You can also run "ruff rule <rule>" to explain a rule on the command line, without a browser or internet access.
+ select = [
+ "E", # pycodestyle
+ "F", # Pyflakes
+ "W", # Warning
+ "B", # flake8-bugbear
+ "A", # flake8-builtins
+ "C4", # flake8-comprehensions
+ "T10", # flake8-debugger
+ "EXE", # flake8-executable,
+ "ISC", # flake8-implicit-str-concat
+ "G", # flake8-logging-format
+ "PIE", # flake8-pie
+ "T20", # flake8-print
+ "PT", # flake8-pytest-style
+ "RSE", # flake8-raise
+ "RET", # flake8-return
+ "TID", # flake8-tidy-imports
+ "ARG", # flake8-unused-arguments
+ "PGH", # pygrep-hooks
+ "PLC", # Pylint Convention
+ "PLE", # Pylint Errors
+ "PLW", # Pylint Warnings
+ "RUF", # Ruff-specific rules
+
+ # ** Things to potentially enable in the future **
+ # DTZ requires all usage of datetime module to have timezone-aware
+ # objects (so have a tz argument or be explicitly UTC).
+ # "DTZ", # flake8-datetimez
+ # "PTH", # flake8-use-pathlib
+ # "SIM", # flake8-simplify
+ ]
+
+ # Files to exclude from linting
+ extend-exclude = [
+ "*.pyc",
+ "__pycache__",
+ ]
+
+ # Linting error codes to ignore
+ ignore = [
+ "F403", # unable to detect undefined names from star imports
+ "F405", # undefined locals from star imports
+ "W605", # invalid escape sequence
+ "A003", # shadowing python builtins
+ "RET505", # unnecessary 'else' after 'return' statement
+ "RET504", # Unnecessary variable assignment before return statement
+ "RET507", # Unnecessary {branch} after continue statement
+ "PT011", # pytest-raises-too-broad
+ "PT012", # pytest.raises() block should contain a single simple statement
+ "PLW0603", # Using the global statement to update is discouraged
+ "PLW2901", # for loop variable overwritten by assignment target
+ "G004", # Logging statement uses f-string
+ "PIE790", # no-unnecessary-pass
+ "PIE810", # multiple-starts-ends-with
+ "PGH003", # Use specific rule codes when ignoring type issues
+ "PLC1901", # compare-to-empty-string
+ ]
+
+ # Linting error codes to ignore on a per-file basis
+ [tool.ruff.per-file-ignores]
+ "__init__.py" = ["F401", "E501"]
+ "acme/somefile.py" = ["E402", "E501"]
+ "acme/somedir/*" = ["E501"]
+
+
+ # Configuration for mypy
+ # https://mypy.readthedocs.io/en/stable/config_file.html#using-a-pyproject-toml-file
+ [tool.mypy]
+ python_version = "3.9"
+ follow_imports = "skip"
+ ignore_missing_imports = true
+ files = "acme" # directory mypy should analyze
+ # Directories to exclude from mypy's analysis
+ exclude = [
+ "acme/somedir",
+ "acme/somefile\\.py",
+ "dirname",
+ ]
+
+
+ # Configuration for pytest
+ # https://docs.pytest.org/en/latest/reference/customize.html#pyproject-toml
+ [tool.pytest.ini_options]
+ testpaths = "tests" # directory containing your tests
+ norecursedirs = [
+ ".vscode",
+ "__pycache__"
+ ]
+ # Warnings that should be ignored
+ filterwarnings = [
+ "ignore::DeprecationWarning"
+ ]
+ # custom markers that can be used using pytest.mark
+ markers = [
+ "slow: lower-importance tests that take an excessive amount of time",
+ ]
+
+
+ # Configuration for coverage.py
+ [tool.coverage.run]
+ # files or directories to exclude from coverage calculations
+ omit = [
+ 'acme/somedir/*',
+ 'acme/somefile.py',
+ ]
+
+
+ # Configuration for vulture
+ [tool.vulture]
+ # Files or directories to exclude from vulture
+ # The syntax is a little funky
+ exclude = [
+ "somedir",
+ "*somefile.py",
+ ]
+
+ # configuration for bandit
+ [tool.bandit]
+ # skips some bandit tests
+ skips=[
+ "B101", # Ignore defensive `assert`s, 80% of repos do this
+ "B311", # Standard pseudo-random generators are not suitable for security/cryptographic purposes
+ "B404", # Ignore warnings about importing subprocess
+ "B603", # Ignore warnings about calling subprocess
+ "B607", # Ignore warnings about calling subprocess
+ ]
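
Note: the `[tool.pytest.ini_options]` table above declares a custom `slow` marker. A minimal sketch (the test name and body are invented) of how such a marker is applied and later deselected with `pytest -m "not slow"`:

```python
# Minimal sketch: applying the "slow" marker declared in [tool.pytest.ini_options].
import time

import pytest


@pytest.mark.slow
def test_large_selection_run():
    # Stand-in for a long-running case; the real test body is hypothetical.
    time.sleep(0.01)
    assert True
```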
setup.cfg DELETED
@@ -1,28 +0,0 @@
- [metadata]
- name = selector
- long_description = file: README.md
- long_description_content_type = text/markdown
- license = GPL-3.0
- license_files = LICENSE
- classifiers =
-     License :: OSI Approved :: GNU General Public License v3 (GPLv3)
-     Programming Language :: Python :: 3
-     Programming Language :: Python :: 3 :: Only
-     Programming Language :: Python :: Implementation :: CPython
-
- [options]
- python_requires = >=3.9
-
- [coverage:run]
- omit =
-     selector/methods/tests/*
-     */tests/*
-     */_version.py
-
- [yapf]
- COLUMN_LIMIT = 109
- INDENT_WIDTH = 4
- USE_TABS = False
-
- [aliases]
- test = pytest
setup.py CHANGED
@@ -1,4 +1,3 @@
- # -*- coding: utf-8 -*-
# The Selector library provides a set of tools for selecting a
# subset of the dataset and computing diversity.
#
@@ -33,7 +32,7 @@ needs_pytest = {"pytest", "test", "ptr"}.intersection(sys.argv)
pytest_runner = ["pytest-runner"] if needs_pytest else []

try:
-     with open("README.md", "r") as handle:
+     with open("README.md") as handle:
        long_description = handle.read()
except ValueError:
    long_description = short_description
@@ -64,7 +63,7 @@ setup(
    ]
    + pytest_runner,
    # Additional entries you may want simply uncomment the lines you want and fill in the data
-     url="https://github.com/theochem/selector",  # Website
+     url="https://github.com/theochem/Selector",  # Website
    install_requires=[
        "numpy>=1.21.2",
        "scipy==1.11.1",
tox.ini CHANGED
@@ -79,11 +79,11 @@ deps =
    # black
    bandit
commands =
-     flake8 selector/ selector/tests setup.py
+     flake8 selector/ selector/test setup.py
    pylint selector --rcfile=tox.ini --disable=similarities
-     # black -l 100 --check ./
-     # black -l 100 --diff ./
-     # Use bandit configuration file
+     # black -l 100 --check ./
+     # black -l 100 --diff ./
+     # Use bandit configuration file
    bandit -r selector -c .bandit.yml

ignore_errors = true
website.yml CHANGED
@@ -23,10 +23,10 @@ jobs:
- uses: actions/checkout@v2

# Install dependencies
- - name: Set up Python 3.8
- uses: actions/setup-python@v2
+ - name: Set up Python 3.11
+ uses: actions/setup-python@v4
with:
- python-version: 3.8
+ python-version: 3.11

- name: Install dependencies
run: |
@@ -37,12 +37,12 @@ jobs:
run: |
jupyter-book build ./book/content
# Create an ghp-branch
- - name ghp branch
+ - name: ghp branch
run: |
ghp-import -n -p -f ./book/content/_build/html

# Push the book's HTML to github-pages
- - name: GitHub Pages action
+ - name: GitHub Pages Action
uses: peaceiris/[email protected]
with:
github_token: ${{ secrets.GITHUB_TOKEN }}