
Commit 485b58b: remove empty cells

tcapelle committed Jul 26, 2024
1 parent 389bea7
Showing 35 changed files with 0 additions and 714 deletions.
7 changes: 0 additions & 7 deletions colabs/catalyst/Catalyst_X_Wandb.ipynb
@@ -19,13 +19,6 @@
"<!--- @wandbcode{catalyst} -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1woAYD9hot7mbknGhbxtix7x7u1fvIZJx?usp=sharing)"
]
},
{
"cell_type": "code",
"execution_count": null,
59 changes: 0 additions & 59 deletions colabs/datasets-predictions/Logging_Timbre_Transfer_with_W&B.ipynb
@@ -8,48 +8,6 @@
"<!--- @wandbcode{tables_whalesong} -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<img src=\"http://wandb.me/logo-im-png\" width=\"400\" alt=\"Weights & Biases\" />\n",
"\n",
"<!--- @wandbcode{tables_whalesong} -->\n",
"\n",
"# Log timbre transfer audio experiments to W&B\n",
"\n",
"Given some input audio (a microphone recording or a file upload), resynthesize the melody of the audio as if it were played on a violin, flute, trumpet, or tenor sax. Log all your experiments to an interactive W&B Table for easy exploration and tuning.\n",
"\n",
"### Source Colab\n",
"\n",
"This notebook is a Weights & Biases integration and wrapper around the amazing [Timbre Transfer Demo with DDSP (Differentiable Digital Signal Processing) from Tensorflow Magenta](https://colab.research.google.com/github/magenta/ddsp/blob/master/ddsp/colab/demos/timbre_transfer.ipynb)\n",
"\n",
"# Timbre Transfer with Interactive Visualization\n",
"\n",
"The notebook processes audio input with timbre transfer, resynthesizing the melody using a model pretrained for various instruments (violin, flute, trumpet, etc). \n",
"\n",
"### [Explore an example with whale songs on W&B](https://wandb.ai/stacey/cshanty/reports/Whale2Song-W-B-Tables-for-Audio--Vmlldzo4NDI3NzM)\n",
"\n",
"<img src=\"https://i.imgur.com/T3vVzWZ.png\" height=400 alt=\"interactive audio wandb Table\"/></a>\n",
"\n",
"This notebook extracts features from input audio:\n",
"* uploaded files\n",
"* microphone recordings (to use this option, make sure to allow microphone access in your browser)\n",
"* URLs to sound files (hardcoded for this demo, feel free to edit the variable SONG_URL)\n",
"\n",
"The available models are trained to generate audio conditioned on a time series of fundamental frequency and loudness. The input audio, synthesized song, and visualizations of the signal will be uploaded to an interactive W&B Table. You can experiment with different recordings, instruments, and various audio settings (using sliders) in this notebook. All of this configuraion will be organized alongside the song versions in one W&B project.\n",
"\n",
"<img src=\"https://i.imgur.com/Jo3vrGm.png\" height=400 alt=\"interactive audio wandb Table\"/></a>\n",
"\n",
"## Additional Resources\n",
"* Full W&B Example: [Visualizing Audio Data with W&B Tables](https://wandb.ai/stacey/cshanty/reports/Whale2Song-W-B-Tables-for-Audio--Vmlldzo4NDI3NzM)\n",
"* [DDSP ICLR paper](https://openreview.net/forum?id=B1x1ma4tDr)\n",
"* [Audio Examples](http://goo.gl/magenta/ddsp-examples) \n",
"* marine mammal recordings from [Watkins Marine Mammal Sound Database](https://cis.whoi.edu/science/B/whalesounds/index.cfm), Woods Hole Oceanographic Institution\n",
"\n",
"\n"
]
},
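The cell removed above describes logging the input audio, the resynthesized audio, and signal visualizations to an interactive W&B Table. A minimal sketch of that kind of logging; the project name, file paths, and column names are chosen purely for illustration and are not taken from the notebook:

```python
import wandb

# Start a run to hold the logged table (project name is illustrative)
run = wandb.init(project="timbre-transfer-demo")

# One row per experiment: the target instrument plus input and resynthesized audio
table = wandb.Table(columns=["instrument", "input_audio", "output_audio"])
table.add_data(
    "violin",
    wandb.Audio("input.wav", caption="original recording"),
    wandb.Audio("violin_resynth.wav", caption="resynthesized"),
)

run.log({"timbre_transfer": table})
run.finish()
```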
{
"cell_type": "markdown",
"metadata": {},
@@ -623,23 +581,6 @@
"wandb.run.finish()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Source colab: Timbre transfer demo from Magenta\n",
"\n",
"This colab relies substantially on the following Timbre Transfer demo:\n",
"<a href=\"https://colab.research.google.com/github/magenta/ddsp/blob/master/ddsp/colab/demos/timbre_transfer.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
"\n",
"##### Copyright 2021 Google LLC.\n",
"\n",
"Licensed under the Apache License, Version 2.0 (the \"License\");\n",
"\n",
"\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
32 changes: 0 additions & 32 deletions colabs/gemini/How_to_use_Gemini_Pro_API_with_WB_Weave.ipynb
@@ -398,38 +398,6 @@
"source": [
"await evaluation.evaluate(model)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ba244582",
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "code",
"execution_count": null,
"id": "c39dc979",
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "code",
"execution_count": null,
"id": "8d2f8124",
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "code",
"execution_count": null,
"id": "f3012c72",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
8 changes: 0 additions & 8 deletions colabs/gemini/google-demo-rag.ipynb
@@ -320,14 +320,6 @@
"# Not using Google Embeddings\n",
"model.predict(questions[1], contexts, False)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9b5f7c60-41dc-4fe2-b3fd-727ace6843be",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
@@ -8,14 +8,6 @@
"<!--- @wandbcode{intro-colab-3-in-1} -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a href=\"https://colab.research.google.com/github/wandb/examples/blob/master/colabs/intro/3_in_1_Intro_to_Weights_&_Biases_CV,_NLP_and_RL.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
"<!--- @wandbcode{intro-colab-3-in-1} -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
10 changes: 0 additions & 10 deletions colabs/intro/Intro_to_Weights_&_Biases.ipynb
@@ -389,16 +389,6 @@
"# Mark the run as finished (useful in Jupyter notebooks)\n",
"wandb.finish()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"# What's next 🚀 ?\n",
"The next tutorial you will learn how to do hyperparameter optimization using W&B Sweeps:\n",
"## 👉 [Hyperparameters sweeps using PyTorch](https://colab.research.google.com/github/wandb/examples/blob/master/colabs/pytorch/Organizing_Hyperparameter_Sweeps_in_PyTorch_with_W%26B.ipynb)"
]
}
],
"metadata": {
18 changes: 0 additions & 18 deletions colabs/intro/Intro_to_Weights_&_Biases_keras.ipynb
@@ -8,14 +8,6 @@
"<!--- @wandbcode{intro-colab-keras} -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a href=\"https://colab.research.google.com/github/wandb/examples/blob/master/colabs/intro/Intro_to_Weights_&_Biases_keras.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
"<!--- @wandbcode{intro-colab-keras} -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -297,16 +289,6 @@
"# Mark the run as finished (useful in Jupyter notebooks)\n",
"wandb.finish()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"# What's next 🚀 ?\n",
"The next tutorial you will learn how to do hyperparameter optimization using W&B Sweeps:\n",
"## 👉 [Hyperparameters sweeps using PyTorch](https://colab.research.google.com/github/wandb/examples/blob/master/colabs/pytorch/Organizing_Hyperparameter_Sweeps_in_PyTorch_with_W%26B.ipynb)"
]
}
],
"metadata": {
10 changes: 0 additions & 10 deletions colabs/intro/Report_API_Quickstart.ipynb
@@ -955,16 +955,6 @@
"report.save()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Can I build the report up from smaller pieces / all at once?\n",
"Yep. We'll demonstrate by putting together a report with a parallel coordinates plot.\n",
"\n",
"NOTE: this section assumes you have run the [sweeps notebook](https://colab.research.google.com/github/wandb/examples/blob/master/colabs/pytorch/Organizing_Hyperparameter_Sweeps_in_PyTorch_with_W%26B.ipynb) already."
]
},
{
"cell_type": "markdown",
"metadata": {},
26 changes: 0 additions & 26 deletions colabs/keras/Simple_Keras_Integration.ipynb
@@ -623,32 +623,6 @@
"Click on the **W&B project page** link above to see your live results."
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# Whats Next? Hyperparameters with Sweeps\n",
"\n",
"We tried out two different hyperparameter settings by hand. You can use Weights & Biases Sweeps to automate hyperparameter testing and explore the space of possible models and optimization strategies.\n",
"\n",
"## [Check out Hyperparameter Optimization in TensorFlow uisng W&B Sweep $\\rightarrow$](https://colab.research.google.com/github/wandb/examples/blob/master/colabs/tensorflow/Hyperparameter_Optimization_in_TensorFlow_using_W&B_Sweeps.ipynb)\n",
"\n",
"Running a hyperparameter sweep with Weights & Biases is very easy. There are just 3 simple steps:\n",
"\n",
"1. **Define the sweep:** We do this by creating a dictionary or a [YAML file](https://docs.wandb.com/library/sweeps/configuration) that specifies the parameters to search through, the search strategy, the optimization metric et all.\n",
"\n",
"2. **Initialize the sweep:** \n",
"`sweep_id = wandb.sweep(sweep_config)`\n",
"\n",
"3. **Run the sweep agent:** \n",
"`wandb.agent(sweep_id, function=train)`\n",
"\n",
"And voila! That's all there is to running a hyperparameter sweep! In the notebook below, we'll walk through these 3 steps in more detail.\n",
"\n",
"<img src=\"https://imgur.com/UiQKg0L.png\" alt=\"Weights & Biases\" />\n"
]
},
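The three steps in the cell removed above map directly onto the wandb API. A minimal sketch, assuming a random search over two hyperparameters; the config values, metric name, and project name are illustrative only:

```python
import wandb

# 1. Define the sweep: search method, metric to optimize, and parameter space
sweep_config = {
    "method": "random",
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        "learning_rate": {"min": 0.0001, "max": 0.1},
        "batch_size": {"values": [32, 64, 128]},
    },
}

# 2. Initialize the sweep and get back an id
sweep_id = wandb.sweep(sweep_config, project="keras-sweep-demo")


def train():
    # The agent calls this once per trial with wandb.config already populated
    with wandb.init() as run:
        lr = run.config.learning_rate
        batch_size = run.config.batch_size
        # ... build and train the model here, logging e.g. run.log({"val_loss": ...})


# 3. Run the sweep agent for a handful of trials
wandb.agent(sweep_id, function=train, count=5)
```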
{
"attachments": {},
"cell_type": "markdown",
@@ -38,14 +38,6 @@
"<img src=\"http://wandb.me/mini-diagram\" width=\"650\" alt=\"Weights & Biases\" />"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"This colab notebook introduces the `WandbEvalCallback` which is an abstract callback that be inherited to build useful callbacks for model prediction visualization and dataset visualization. Refer to the [💫 `WandbEvalCallback`](https://colab.research.google.com/drive/107uB39vBulCflqmOWolu38noWLxAT6Be#scrollTo=u50GwKJ70WeJ&line=1&uniqifier=1) section for more details."
]
},
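For reference on the cell removed above: `WandbEvalCallback` lives in `wandb.integration.keras` and is subclassed by implementing one hook that fills a ground-truth table once and another that appends model predictions each epoch. A rough sketch under those assumptions; the column names and the image-classification setup are illustrative, not taken from the notebook:

```python
import wandb
from wandb.integration.keras import WandbEvalCallback


class ValidationTableCallback(WandbEvalCallback):
    """Logs a table of validation samples and per-epoch model predictions."""

    def __init__(self, x_val, y_val):
        super().__init__(
            data_table_columns=["idx", "input", "label"],
            pred_table_columns=["epoch", "idx", "input", "label", "prediction"],
        )
        self.x_val = x_val
        self.y_val = y_val

    def add_ground_truth(self, logs=None):
        # Called once: populate the ground-truth data table
        for idx, (x, y) in enumerate(zip(self.x_val, self.y_val)):
            self.data_table.add_data(idx, wandb.Image(x), y)

    def add_model_predictions(self, epoch, logs=None):
        # Called at the end of each epoch: add predictions that reference the data table
        preds = self.model.predict(self.x_val, verbose=0)
        for idx in self.data_table_ref.get_index():
            self.pred_table.add_data(
                epoch,
                self.data_table_ref.data[idx][0],
                self.data_table_ref.data[idx][1],
                self.data_table_ref.data[idx][2],
                preds[idx].argmax(),
            )
```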
{
"attachments": {},
"cell_type": "markdown",
14 changes: 0 additions & 14 deletions colabs/keras/dreambooth/inference.ipynb
@@ -8,20 +8,6 @@
"<!--- @wandbcode{dreambooth-keras-inference} -->"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# 🧨 Dreambooth-Keras + WandB 🪄🐝\n",
"\n",
"[![](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/soumik12345/dreambooth-keras/blob/main/notebooks/inference_wandb.ipynb)\n",
"\n",
"<!--- @wandbcode{dreambooth-keras-inference} -->\n",
"\n",
"This notebook shows how to perform inference with a DreamBooth fine-tuned Stable Diffusion model."
]
},
{
"attachments": {},
"cell_type": "markdown",
14 changes: 0 additions & 14 deletions colabs/keras/dreambooth/train.ipynb
@@ -8,20 +8,6 @@
"<!--- @wandbcode{dreambooth-keras-train} -->"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# 🧨 Dreambooth-Keras + WandB 🪄🐝\n",
"\n",
"[![](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/soumik12345/dreambooth-keras/blob/main/notebooks/inference_wandb.ipynb)\n",
"\n",
"<!--- @wandbcode{dreambooth-keras-train} -->\n",
"\n",
"This notebook shows how to train a DreamBooth model for fine-tuning Stable Diffusion on a new visual concept."
]
},
{
"attachments": {},
"cell_type": "markdown",
22 changes: 0 additions & 22 deletions colabs/keras/keras_core/monai_medmnist_keras.ipynb
@@ -17,21 +17,6 @@
"<!--- @wandbcode{keras_core_timm} -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# 🩺 Medical Image Classification Tutorial using MonAI and Keras\n",
"\n",
"[![](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/wandb/examples/blob/master/colabs/keras/keras_core/monai_medmnist_keras.ipynb)\n",
"\n",
"This notebook demonstrates\n",
"- an end-to-end training using [MonAI](https://github.com/Project-MONAI/MONAI) and [KerasCore](https://github.com/keras-team/keras-core).\n",
"- how we can use the backend-agnostic Keras callbacks for [Weights & Biases](https://wandb.ai/site) to manage and track our experiment.\n",
"\n",
"Original Notebook: https://github.com/Project-MONAI/tutorials/blob/main/2d_classification/mednist_tutorial.ipynb"
]
},
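The cell removed above mentions the backend-agnostic Keras callbacks for Weights & Biases, but the diff does not show how they are wired up. A minimal sketch using the standard `wandb` Keras callbacks as a stand-in, with a toy model and random data in place of the MedMNIST pipeline:

```python
import numpy as np
import wandb
from tensorflow import keras
from wandb.integration.keras import WandbMetricsLogger, WandbModelCheckpoint

run = wandb.init(project="medmnist-keras-demo", config={"epochs": 2, "batch_size": 64})

# Toy stand-in for the MedMNIST images and labels used in the notebook
x_train = np.random.rand(512, 28, 28, 1).astype("float32")
y_train = np.random.randint(0, 6, size=(512,))

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(6, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

model.fit(
    x_train,
    y_train,
    epochs=run.config.epochs,
    batch_size=run.config.batch_size,
    callbacks=[
        WandbMetricsLogger(log_freq="epoch"),          # stream per-epoch metrics to W&B
        WandbModelCheckpoint(filepath="model.keras"),  # log model checkpoints to W&B
    ],
)

run.finish()
```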
{
"cell_type": "markdown",
"metadata": {},
@@ -404,13 +389,6 @@
"\n",
"wandb.finish()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
13 changes: 0 additions & 13 deletions colabs/keras/keras_core/timm_keras.ipynb
@@ -17,19 +17,6 @@
"<!--- @wandbcode{keras_core_timm} -->"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# 🔥 Fine-tune a [Timm](https://huggingface.co/docs/timm/index) Model with Keras and WandB 🦄\n",
"\n",
"[![](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/wandb/examples/blob/master/colabs/keras/keras_core/timm_keras.ipynb)\n",
"\n",
"This notebook demonstrates\n",
"- how we can fine-tune a pre-trained model from timm using [KerasCore](https://github.com/keras-team/keras-core).\n",
"- how we can use the backend-agnostic Keras callbacks for [Weights & Biases](https://wandb.ai/site) to manage and track our experiment."
]
},
{
"cell_type": "markdown",
"metadata": {},