The command to reproduce is:
lcdb test -id 6 -w lcdb.workflow.keras.DenseNNWorkflow -m -vs 42 -ts 42 -ws 42 --parameters '{"activation": "tanh", "activity_regularizer": "none", "batch_norm": false, "batch_size": 74, "bias_regularizer": "L1", "dropout_rate": 0.5235265140514189, "kernel_initializer": "random_uniform", "kernel_regularizer": "none", "learning_rate": 0.0010642794778408, "num_layers": 13, "num_units": 4, "optimizer": "Nadam", "regularizer_factor": 0.7423546250656852, "shuffle_each_epoch": false, "skip_co": true, "transform_cat": "onehot", "transform_real": "none"}'
The example trace is:
2023-12-22 22:56:01,852 - ERROR - controller.py:fit_workflow_on_current_anchor - Error while fitting the workflow:
Traceback (most recent call last):
  File "/lus/grand/projects/datascience/regele/polaris/lcdb/publications/2023-neurips/lcdb/controller.py", line 208, in fit_workflow_on_current_anchor
    self.workflow.fit(
  File "/lus/grand/projects/datascience/regele/polaris/lcdb/publications/2023-neurips/lcdb/utils.py", line 67, in terminate_on_timeout
    return results.get(timeout)
  File "/lus/grand/projects/datascience/regele/polaris/lcdb/publications/2023-neurips/build/dhenv/lib/python3.10/multiprocessing/pool.py", line 774, in get
    raise self._value
  File "/lus/grand/projects/datascience/regele/polaris/lcdb/publications/2023-neurips/build/dhenv/lib/python3.10/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/lus/grand/projects/datascience/regele/polaris/lcdb/publications/2023-neurips/lcdb/workflow/_base_workflow.py", line 31, in fit
    self._fit(X=X, y=y, metadata=metadata, *args, **kwargs)
  File "/lus/grand/projects/datascience/regele/polaris/lcdb/publications/2023-neurips/lcdb/workflow/keras/_dense.py", line 299, in _fit
    y_valid_ = self._transformer_label.transform(y_valid)
  File "/lus/grand/projects/datascience/regele/polaris/lcdb/publications/2023-neurips/build/dhenv/lib/python3.10/site-packages/sklearn/preprocessing/_label.py", line 137, in transform
    return _encode(y, uniques=self.classes_)
  File "/lus/grand/projects/datascience/regele/polaris/lcdb/publications/2023-neurips/build/dhenv/lib/python3.10/site-packages/sklearn/utils/_encode.py", line 232, in _encode
    raise ValueError(f"y contains previously unseen labels: {str(diff)}")
ValueError: y contains previously unseen labels: [1, 3, 4, 6, 7, 9, 11, 13, 14, 16, 18, 19, 24]
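The root cause appears to be that the label transformer is fit only on the training labels, so any class that occurs solely in the validation split is "unseen" at transform time. A minimal sketch of this failure mode with sklearn's `LabelEncoder` (hypothetical toy labels, not the actual lcdb dataset; one possible mitigation shown, not necessarily the fix the maintainers chose):

```python
import numpy as np
from sklearn.preprocessing import LabelEncoder

# Toy labels: class 1 appears only in the validation split.
y_train = np.array([0, 2, 5, 0, 2])
y_valid = np.array([0, 1, 2])

# Encoder fit on training labels only, as in _dense.py's _fit path.
enc = LabelEncoder().fit(y_train)
try:
    enc.transform(y_valid)
except ValueError as e:
    # Same error as in the trace: "y contains previously unseen labels: [1]"
    print(e)

# One possible mitigation: fit the encoder on the union of all labels
# that can occur across splits, before transforming either split.
enc_all = LabelEncoder().fit(np.concatenate([y_train, y_valid]))
encoded_valid = enc_all.transform(y_valid)
print(encoded_valid)
```

Alternatively, a stratified split would guarantee every class appears in the training fold, which avoids the problem at the data-splitting level rather than at the encoder.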