
Update Keras documentation #832

Open · wants to merge 4 commits into base: main
33 changes: 17 additions & 16 deletions docs/source/keras.rst
@@ -14,8 +14,8 @@ these packages need to be installed:

.. code-block:: bash

-   $ pip install tensorflow>=2.3.0
-   $ pip install scikeras>=0.1.8
+   $ pip install tensorflow>=2.4.0
+   $ pip install scikeras>=0.3.2
Member:
Why is this required? Dask-ML will still work with the removed versions, right?

Author (@adriangb, May 7, 2021):

Things should work generally, but some of the syntax in this tutorial may not. I think we should either update the versions, or remove them altogether (since not specifying a version usually gets you the latest version anyway).

Member:

Dask-ML will not work with SciKeras v0.1.7. I think that version didn't have serialization (?).

We should make a note about the versioning. "The example below uses X. The usage with lower versions may be different than this example."

Member:

Wasn't there also some issue about serialization of stateful optimizers like Adam?

Author:

I'll add a note along the lines of #832 (comment)

> Wasn't there also some issue about serialization of stateful optimizers like Adam?

Yeah, we fixed that in v0.3.0, which is another good reason to bump the "recommended" version numbers in these docs, although I don't think we want to mention that here right?

Member (@stsievert, May 7, 2021):

> we fixed [serialization] in v0.3.0

That's a really good reason to require SciKeras v0.3.0.


These are the minimum versions that Dask-ML requires to use TensorFlow/Keras.
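Since the discussion above hinges on minimum versions, a runtime guard can make the requirement explicit instead of failing later with an obscure error. A minimal sketch using only the standard library (``meets_minimum`` and ``require_minimum`` are hypothetical helpers, not part of Dask-ML or SciKeras; pre-release suffixes like ``rc1`` are not handled):

```python
from importlib.metadata import PackageNotFoundError, version


def meets_minimum(installed: str, minimum: str) -> bool:
    """Compare dotted release numbers field by field.

    Illustrative only: pre-release suffixes (e.g. "2.4.0rc1") would
    need a real version parser such as ``packaging.version``.
    """
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)


def require_minimum(package: str, minimum: str) -> None:
    """Raise ImportError if ``package`` is missing or older than ``minimum``."""
    try:
        installed = version(package)
    except PackageNotFoundError:
        raise ImportError(f"{package}>={minimum} is required") from None
    if not meets_minimum(installed, minimum):
        raise ImportError(
            f"{package} {installed} found, but >= {minimum} is required"
        )
```

With the floors discussed above, the docs could then call ``require_minimum("scikeras", "0.3.0")`` before the tutorial code runs.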

@@ -36,24 +36,18 @@ normal way to create a `Keras Sequential model`_
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

-   def build_model(lr=0.01, momentum=0.9):
+   def build_model():
        layers = [Dense(512, input_shape=(784,), activation="relu"),
                  Dense(10, input_shape=(512,), activation="softmax")]
-       model = Sequential(layers)
-
-       opt = tf.keras.optimizers.SGD(
-           learning_rate=lr, momentum=momentum, nesterov=True,
-       )
-       model.compile(loss="categorical_crossentropy", optimizer=opt, metrics=["accuracy"])
-       return model
+       return Sequential(layers)

Member (on ``def build_model():``): 👍

Now, we can use SciKeras to create a Scikit-learn-compatible model:

.. code-block:: python

    from scikeras.wrappers import KerasClassifier
    niceties = dict(verbose=False)
-   model = KerasClassifier(build_fn=build_model, lr=0.1, momentum=0.9, **niceties)
+   model = KerasClassifier(build_model, loss="categorical_crossentropy",
+                           optimizer=tf.keras.optimizers.SGD, **niceties)

This model will work with all of Dask-ML: it can use NumPy arrays as inputs and
obeys the Scikit-learn API. For example, it's possible to use Dask-ML to do the
@@ -63,12 +57,19 @@ following:
:class:`~dask_ml.model_selection.HyperbandSearchCV`.
* Use Keras with Dask-ML's :class:`~dask_ml.wrappers.Incremental`.
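The claim that the wrapper "obeys the Scikit-learn API" is what makes the bullet points above work: Dask-ML's ``Incremental`` and ``HyperbandSearchCV`` only need ``get_params``/``set_params``, ``partial_fit``, and ``score``. A pure-Python illustration of that contract (``TinyEstimator`` is hypothetical and exists only to show the interface; SciKeras's ``KerasClassifier`` implements the same methods backed by a real Keras model):

```python
class TinyEstimator:
    """Hypothetical minimal estimator showing the interface Dask-ML
    relies on; no Keras or Dask required to follow the shape."""

    def __init__(self, lr=0.1):
        self.lr = lr

    def get_params(self, deep=True):
        # Hyperparameters must be discoverable so searches can clone us.
        return {"lr": self.lr}

    def set_params(self, **params):
        # Searches reconfigure cloned estimators through this hook.
        for name, value in params.items():
            setattr(self, name, value)
        return self

    def partial_fit(self, X, y, classes=None):
        # Incremental-training hook used by Incremental and Hyperband;
        # here we only count samples instead of updating weights.
        self.n_seen_ = getattr(self, "n_seen_", 0) + len(X)
        return self

    def score(self, X, y):
        # Placeholder metric; a real estimator returns e.g. accuracy.
        return 1.0
```

Any object with these methods, fed NumPy arrays, can slot into the Dask-ML workflows listed above.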

-If we want to tune ``lr`` and ``momentum``, SciKeras requires that we pass
-``lr`` and ``momentum`` at initialization:
+If we want to tune SGD's ``learning_rate`` and ``momentum``, SciKeras requires that we pass
+``learning_rate`` and ``momentum`` at initialization:

-.. code-block::
+.. code-block:: python

-   model = KerasClassifier(build_fn=build_model, lr=None, momentum=None, **niceties)
+   model = KerasClassifier(
+       build_model,
+       loss="categorical_crossentropy",
+       optimizer=tf.keras.optimizers.SGD,
+       optimizer__learning_rate=0.1,
+       optimizer__momentum=0.9,
+       **niceties
+   )
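The ``optimizer__learning_rate`` names above follow scikit-learn's double-underscore convention: the prefix selects the target component and the suffix is the keyword argument it receives. A rough sketch of how such names can be grouped into per-component keyword dicts (``split_routed_params`` is a hypothetical helper for illustration, not SciKeras's actual routing implementation):

```python
def split_routed_params(params):
    """Group ``component__param`` names into nested dicts,
    mimicking scikit-learn-style parameter routing."""
    routed = {}
    for name, value in params.items():
        target, sep, key = name.partition("__")
        if sep:
            # "optimizer__momentum" -> routed["optimizer"]["momentum"]
            routed.setdefault(target, {})[key] = value
        else:
            # No separator: a top-level parameter like "verbose".
            routed[name] = value
    return routed
```

Under this scheme the wrapper can later call, say, ``SGD(**routed["optimizer"])``, which is why the routed names become tunable hyperparameters.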

.. _SciKeras: https://github.com/adriangb/scikeras

@@ -101,7 +102,7 @@ And let's perform the basic task of tuning our SGD implementation:
.. code-block:: python

from scipy.stats import loguniform, uniform
-   params = {"lr": loguniform(1e-3, 1e-1), "momentum": uniform(0, 1)}
+   params = {"optimizer__learning_rate": loguniform(1e-3, 1e-1),
+             "optimizer__momentum": uniform(0, 1)}
X, y = get_mnist()
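``loguniform(1e-3, 1e-1)`` in the search space above draws learning rates uniformly on a log scale, which spreads samples evenly across orders of magnitude rather than clustering them near the upper end. An equivalent hand-rolled sampler using only the standard library (``loguniform_sample`` is illustrative, not part of SciPy):

```python
import math
import random


def loguniform_sample(low, high, rng=random):
    """Draw one value from a log-uniform distribution on [low, high],
    mirroring what scipy.stats.loguniform(low, high).rvs() returns."""
    # Sample uniformly in log space, then map back with exp.
    return math.exp(rng.uniform(math.log(low), math.log(high)))
```

With ``low=1e-3`` and ``high=1e-1``, roughly half the draws land in each decade, which is usually what you want when tuning a learning rate.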

Now, the search can be run: