Commit

Updated Load and Analyze Your Data with Autonomous Database workshop (freetier and livelabs) for ords 24.1 (#306)

* Update adw-dcat-integrate.md

* Update oracle-data-provider.md

* Update oracle-data-provider.md

* new lab

* Update manifest.json

* updates

* Update use-delta-sharing.md

* Update use-delta-sharing.md

* Update use-delta-sharing.md

* livelabs folder

* Update adw-dcat-integrate.md

* updates

* updates to Introduction lab

* Update introduction.md

* LiveLabs workshop version and other freetier changes

* Update introduction.md

* Update introduction.md

* livelabs and freetier changes

* Update load-analyze-json.md

* Update load-analyze-rest.md

* more changes

* more updates

* more updates

* Update adw-dcat-integrate.md

* changed shared with serverless as deployment type

* Update load-local-data.md

* update shared to serverless

* update adb-dcat workshop

* added new lab 12

* Update manifest.json

* added new folder with an .md file in it

Please work!

* more testing

* Delete introduction.md

* new workshop and labs wip

* update labs

* more updates

* more updates

* more updates

* updates

* more updates

* updates before review cycle

* Update endpoint.png

* Update setup-workshop-environment.md

* Update setup-workshop-environment.md

* more updates

* final update before review

* updates

* replacement code

* Update create-share-recipients.md

* Update create-share-recipients.md

* Update create-share-recipients.md

* Update create-share-recipients.md

* Update create-share-recipients.md

* Update create-share-recipients.md

* updates

* Update manifest.json

* folder rename

* added content to data studio folder

* Delete user-bucket-credential-diagram.png

* updates self-qa

* Update introduction.md

* remove extra text files

* Update introduction.md

* Update setup-workshop-environment.md

* Data Studio Workshop Changes

* changes to data studio workshop

* Update setup-workshop-environment.md

* adb changes

* Update recipient-diagram.png

* diagram change

* Update user-bucket-credential-diagram.png

* SME feedback

* Update create-share.md

* Nilay changes

* changes

* Update consume-share.md

* Anoosha's feedback

* Update consume-share.md

* updated 2 screens and a sentence

* minor changes

* deleted extra images and added doc references

* new ECPU changes

* more changes to data sharing workshops

* more changes to fork (data studio)

* more changes

* Marty's feedback

* Marty's feedback to plsql workshop too

* Update setup-workshop-environment.md

* Delete 7381.png

* workshop # 3 ADB set up

and a couple of minor typos in workshops 1 and 2

* changes to adb-dcat workshop

* more changes

* minor typos in all 4 workshops

* quarterly qa build data lake

* new lab 11 in build DL with ADW

* minor changes database actions drop-down list

* final changes to build data lake workshop

* AI updates

AI workshop updates

* ai workshop updates

* Update query-using-select-ai.md

* Update query-using-select-ai.md

* updates

* more updates

* Update query-using-select-ai.md

* more new updates to ai workshop

* Update query-using-select-ai.md

* a new screen capture

* push Marty's feedback to fork

Final changes.

* updates sandbox manifest

* updates

* restored sandbox manifest

* Update setup-environment.md

* updates after CloudWorld

* final updates to ai workshop (also new labs 4 and 5)

* marty's feedback

* incorporated feedback

* minor PR edits by Sarah

* removed steps 7 & 8 Lab 2 > Task 3 per Alexey

The customer asked to remove this as it's not a requirement for the bucket to be public.

* more changes

* more changes per Alexey

* Update load-os-data-public.md

* Quarterly QA

I added a new step per the PM's request in the Data Sharing PL/SQL workshop. I also made a minor edit (removed space) in the Data Lake workshop.

* more updates

* Quarterly QA changes

* Update consume-share.md

* minor edit based on workshop user

* quarterly qa November 2023

* Added new videos to the workshop

Replaced 3 old silent videos with new ones. Added two new videos.

* Adding important notes to the two data sharing workshops

Per the PM's request.

* folder structure only  push to production

This push and the PR later is to make sure the folder structure is in the production repo before I start development. Only 1 .md file and the workshops folder.

* typos

* cloud links workshop

* UPDATES

* Update query-view.png

* update

* minor updates to chat ai workshop (Fork)

* test clones

* test pr

* Alexey's feedback

* Update data-sharing-diagram.png

* sarah's edits

* changes to Data Load UI

* removed script causing ML issue

* Update load-local-data.md

* updates: deprecated procedure and new code

* updates and test

* more updates

* minor update

* testing using a building block in a workshop

* updates

* building blocks debugging

* Update manifest.json

* fixing issues

* Update manifest.json

* delete cleanup.md from workshop folder (use common file)

* use common cleanup.md instead of local cleanup.md

* test common tasks

* update data sharing data studio workshop

* Update create-recipient.png

* PM's 1 feedback

* quarterly qa

* missing "Lab 2" from Manifest

* always free note addition

added a note

* always free change

* Update setup-environment.md

* update manage and monitor workshop

* Folder structure for new data share workshop (plus introduction.md)

* Updated Load and Analyze from clone

* Data Lake minor changes from clone

* manage and monitor workshop

* Remove the lab from the workshop per Marty's request

* mark-hornick-feedback

* used marty's setup file

* replaced notebook with a new one

* updates to lab 6 of manage and monitor

* Update adb-auto-scaling.md

* Nilay's feedback

* Update adb-auto-scaling.md

* updates to second ai workshop

* note change

* Changes to Load and Analyze workshop (other minor changes too)

---------

Co-authored-by: Michelle Malcher <[email protected]>
Co-authored-by: Sarah Hirschfeld <[email protected]>
3 people authored Apr 16, 2024
1 parent e0b0af4 commit 8f7274e
Showing 38 changed files with 51 additions and 44 deletions.
@@ -262,4 +262,4 @@ You may now **proceed to the next lab**.
* Marty Gubar, ADB Product Management
* Richard Green, Principal Developer, Database User Assistance

- **Last Updated By/Date:** Lauran K. Serhal, March 2024
- **Last Updated By/Date:** Lauran K. Serhal, April 2024
@@ -15,7 +15,6 @@ In this lab, you will:
* Load JSON data from Oracle Object Storage using the `DBMS_CLOUD.COPY_COLLECTION` procedure
* Use SQL to analyze both simple and complex JSON attributes


### Prerequisites

- This lab requires completion of the lab **Provision an Autonomous Database**, in the Contents menu on the left.
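For readers skimming the diff, the `DBMS_CLOUD.COPY_COLLECTION` call this lab objective refers to generally takes the following shape. This is a sketch only: the collection name, credential name, and object storage URI below are illustrative placeholders, not values from the workshop.

```sql
-- Sketch only: names and the URI are hypothetical placeholders.
BEGIN
  DBMS_CLOUD.COPY_COLLECTION(
    collection_name => 'MOVIE_JSON',        -- SODA collection created/loaded by the call
    credential_name => 'OBJ_STORE_CRED',    -- credential object created beforehand
    file_uri_list   => 'https://objectstorage.region.oraclecloud.com/n/namespace/b/bucket/o/movies.json',
    format          => '{"recorddelimiter" : "0x''0A''"}'  -- one JSON document per line
  );
END;
/
```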
@@ -47,4 +46,4 @@ You may now proceed to the next lab.

* **Author** - Marty Gubar, Autonomous Database Product Management
* **Contributor:** Lauran K. Serhal, Consulting User Assistance Developer
* **Last Updated By/Date:** Lauran K. Serhal, March 2024
* **Last Updated By/Date:** Lauran K. Serhal, April 2024
*(19 binary files changed; previews not available.)*
@@ -48,19 +48,21 @@ Oracle MovieStream is a fictitious movie streaming service - similar to those th

3. On the **Autonomous Databases** page, click your ADB instance. Make sure you select the appropriate compartment from the **Compartment** drop-down list in the **List Scope** section.

![The Autonomous Database is displayed and highlighted.](./images/adb-page.png " ")

4. On the **Autonomous Database details** page, click the **Database actions** drop-down list, and then click **View all database actions**.
<if type="livelabs">
![The Autonomous Database is displayed and highlighted.](./images/ll-adb-page.png " ")
</if>

![On the partial Autonomous Database Details page, the Database Actions button is highlighted.](./images/click-db-actions.png " ")
<if type="freetier">
![The Autonomous Database is displayed and highlighted.](./images/adb-page.png " ")
</if>

5. The **Database Actions Launchpad** Home page is displayed _in a new tab_ in your browser. Scroll-down to the **Data Studio** section, and then click the **DATA LOAD** card.
4. Click the **Database actions** drop-down list, and then select **Data Load**.

![The Database Actions Launchpad page is displayed.](./images/launchpad-page.png =70%x*)
![Click data load from the database actions.](./images/click-data-load-drop-down.png =50%x*)

5. The **Data Load** Home page is displayed in a _**new tab in your browser**_.
5. The **Data Load** Home page is displayed in a _**new browser tab**_.

![The Data Load Home page is displayed.](./images/data-load-home.png =70%x*)
![Click the Data Load card.](./images/data-load-home-page-2.png =70%x*)

## Task 4: Load Data from the CSV Files Using the LOAD DATA Tool

@@ -78,8 +80,6 @@ In this task you will load the two .csv files that you downloaded earlier into t

>**Note:** If you have an issue uploading both files simultaneously, you can select one file at a time. Select the first downloaded file using step 3. When the file is uploaded, click the **Select Files** icon on the **Load Data** page, and then select the second file.
![Select the one file at a time.](./images/select-second-file.png " ")

4. When the upload is complete, you will make a small change to the default table name that will be created for the *customer-extension.csv* file. Click the **Settings** (pencil) icon to the right of *customer-extension.csv*.

![Update the data load job settings.](./images/click-settings.png " ")
Expand All @@ -88,33 +88,33 @@ In this task you will load the two .csv files that you downloaded earlier into t

![Examine the editor of the data load job.](./images/preview-table.png " ")

7. In the **Name** field, change the table name that will be created from **CUSTOMEREXTENSION** to **CUSTOMER\_EXTENSION**. Click **Close** in the lower right corner of the page.
6. In the **Name** field, change the table name that will be created from **CUSTOMEREXTENSION** to **CUSTOMER\_EXTENSION**. Click **Close** in the lower right corner of the page.

![Examine the editor of the data load job.](./images/change-table-name.png " ")

8. Click **Start**. A **Start Load from Local Files** confirmation dialog box is displayed. Click **Run**.
7. Click **Start**. A **Start Load from Local Files** confirmation dialog box is displayed. Click **Run**.

![Run the data load.](./images/click-start.png " ")

9. When the load job is complete, a green check mark appears next to each table. Click **Catalog** in the menu on the left.
8. When the load job is complete, a green check mark appears next to each table in the Data Load dashboard. Click **Catalog** in the menu on the left.

![Click Catalog in the menu on the left.](./images/click-catalog.png " ")
![Click Catalog in the menu on the left.](./images/load-completed.png " ")

> **Note:** If the menu on the left is collapsed, click the double arrows icon to expand it so that the label for each icon is displayed.
![Click Expand to expand the menu on the left.](./images/expand-menu.png " ")

10. The Catalog displays the two newly created tables: *CUSTOMER\_SEGMENT* and *CUSTOMER\_EXTENSION*.
9. The Catalog displays the two newly created tables, *CUSTOMER\_SEGMENT* and *CUSTOMER\_EXTENSION*, at the top of the Data Load dashboard among the other tables that were created earlier.

![View the new table in the Catalog.](./images/display-new-tables.png " ")

You can click a table name to display its data. Click the *CUSTOMER\_SEGMENT* table to view the data.
You can click a table name link to display its data. Click the *CUSTOMER\_SEGMENT* table to view the data.

![Click customer_segment name link.](./images/customer-segment-link.png " ")

![Click customer_segment to display its data.](./images/customer-segment-data.png " ")

11. When finished, click **Close**, and then click the **Data Load** in the menu on the left. Click **Done**.
10. When finished, click **Close**, and then click **Data Load** in the menu on the left to return to the **Data Load** page.

![Click Done.](./images/click-done.png " ")
![Click Done.](./images/return-data-load.png " ")

The **Data Load** page is re-displayed.

@@ -133,11 +133,11 @@ You may now proceed to the next lab.
* **Contributors:**
* Mike Matthews, Autonomous Database Product Management
* Marty Gubar, Autonomous Database Product Management
* **Last Updated By/Date:** Lauran K. Serhal, March 2024
* **Last Updated By/Date:** Lauran K. Serhal, April 2024

Data about movies in this workshop were sourced from Wikipedia.

Copyright (C) Oracle Corporation.
Copyright (c) 2024 Oracle Corporation.

Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.3
*(6 binary files changed; previews not available.)*
@@ -267,11 +267,15 @@ This task shows how to load data from Oracle Cloud Infrastructure Object Storage
+ **copy_data**: Loads the specified source file to a table. The table must already exist in ADW.
+ You will use this procedure to load tables to your admin schema with data from data files staged in the Oracle Cloud Infrastructure Object Storage cloud service.
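The `copy_data` bullet above can be illustrated with a minimal sketch. The table, credential, and URI names here are hypothetical, and (as the bullet notes) the target table must already exist before the call:

```sql
-- Sketch only: assumes a POTENTIAL_CHURNERS table was created first and
-- that a credential named OBJ_STORE_CRED exists (both names hypothetical).
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'POTENTIAL_CHURNERS',
    credential_name => 'OBJ_STORE_CRED',
    file_uri_list   => 'https://objectstorage.region.oraclecloud.com/n/namespace/b/bucket/o/potential_churners.csv',
    format          => '{"type" : "csv", "skipheaders" : 1}'  -- skip the CSV header row
  );
END;
/
```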

1. Now that you've created the Cloud Location to connect to the Oracle Object Store, you're ready to load the `potential_churners.csv` file from your bucket. Navigate back to the main **Database Actions Launchpad** using the breadcrumb link in the upper left corner.
1. Now that you've created the Cloud Location to connect to the Oracle Object Store, you're ready to load the `potential_churners.csv` file from your bucket. Navigate back to the main **Database Actions Launchpad**. Click **Database Actions** in the banner to go to the Launchpad.

![Click Database Actions in the banner.](./images/click-database-actions.png =50%x*)

>**Note:** If you are prompted for username and password, enter the username `admin` and the password you created for `admin` when you created your autonomous database.
2. Under **Development**, click **SQL** to open SQL Worksheet.
2. On the Launchpad, click the **Development** tab, and then click the **SQL** tab to open SQL Worksheet.

![Navigate to the SQL worksheet.](./images/navigate-sql-worksheet.png =75%x*)

3. Unlike the earlier tasks where the Database Actions DATA LOAD tool gave you the option to automatically create the target Oracle Autonomous Database tables during the data load process, the following steps for loading with the `DBMS_CLOUD` package require you to first create the target tables.

@@ -405,7 +409,7 @@ Replace the provided example URL with the real object storage base URL that you

![Get logfile and badfile table names.](./images/log-and-table-names.png " ")

6. Query the log table to see detailed information about an individual load. In our example, the table name is `copy$2_log`.
6. Query the log table to see detailed information about an individual load. In our example, the table name is `copy$12_log`.

![Type the query and click Run Script.](./images/query-log-file.png " ")
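Step 6 amounts to a plain SELECT against the generated log table; the exact name (here `copy$12_log`) differs for every load job, so look it up first rather than hard-coding it:

```sql
-- The generated table names appear in the LOGFILE_TABLE and BADFILE_TABLE
-- columns of USER_LOAD_OPERATIONS; copy$12_log is the example from this lab.
SELECT * FROM copy$12_log;
```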

@@ -440,7 +444,7 @@ Replace the provided example URL with the real object storage base URL that you
"trimspaces":"lrtrim",
"truncatecol":"true",
"ignoremissingcolumns":"true"
}',
}'
);
end;
/
@@ -453,8 +457,8 @@ Replace the provided example URL with the real object storage base URL that you

10. View the results by running this query:

<copy>
```
<copy>
SELECT *
FROM genre_debug;
</copy>
@@ -480,11 +484,11 @@ See the documentation [Loading Data with Autonomous Database](https://docs.oracl
* Lauran K. Serhal, Consulting User Assistance Developer
* Rick Green, Principal Developer

* **Last Updated By/Date** - Lauran K. Serhal, March 2024
* **Last Updated By/Date** - Lauran K. Serhal, April 2024

Data about movies in this workshop were sourced from Wikipedia.

Copyright (C) Oracle Corporation.
Copyright (c) 2024 Oracle Corporation.

Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.3
@@ -46,11 +46,11 @@ You may now **proceed to the next lab**.
## Acknowledgements

* **Authors:**
* Lauran K. Serhal, Consulting User Assistance Developer
* Mike Matthews, Autonomous Database Product Management
* Marty Gubar, Autonomous Database Product Management
* Lauran K. Serhal, Consulting User Assistance Developer
* **Contributor:** Rick Green, Database User Assistance
* **Last Updated By/Date:** Lauran K. Serhal, March 2024
* **Last Updated By/Date:** Lauran K. Serhal, April 2024

Data about movies in this workshop were sourced from Wikipedia.

@@ -43,5 +43,5 @@ You may now **proceed to the next lab**.
## Acknowledgements

- **Author:** Lauran K. Serhal, Consulting User Assistance Developer
- **Last Updated By/Date:** - Lauran K. Serhal, March 2024
- **Last Updated By/Date:** Lauran K. Serhal, April 2024
- **Built with Common Tasks**
@@ -4,7 +4,7 @@
"db_ocpu": "2 ECPUs",
"db_storage": "1 TB",
"db_name_livelabs": "MOVIE+your user id",
"db_name_livelabs_example": "MOVIE2252",
"db_name_livelabs_example": "MOVIE81481",
"db_workload_type":"Autonomous Data Warehouse",
"oac_instance_name": "WORKSHOPADWOAC"
}
@@ -233,7 +233,7 @@ You can import a notebook from a local disk or from a remote location if you pro

If the import is successful, a notification is displayed and the **`ADB Speaks Human`** notebook is displayed in the list of available notebooks.

![The 1 out of 1 notebooks imported successfully message is displayed. The newly imported notebook name link is displayed and highlighted on the page.](./images/import-successful.png " ")
![The newly imported notebook is displayed.](./images/import-successful.png " ")

4. Open the imported notebook. Click the **ADB Speaks Human** notebook link. The notebook is displayed in the Notebook **Editor**. Read the paragraphs in this notebook.

@@ -146,25 +146,29 @@ Data Catalog offers both aggregate and individual resource-types for writing pol

Create a Data Catalog instance using the following steps.

1. Open the **Navigation** menu and click **Analytics & AI**. Under **Data Lake**, click **Data Catalog**.
1. From the Console, open the **Navigation** menu.

![Click the Navigation menu.](./images/click-navigation-menu.png =70%x*)

2. Click **Analytics & AI**. Under **Data Lake**, click **Data Catalog**.

![From the Navigation menu, navigate to Data Catalog.](./images/navigate-data-catalog.png " ")
![From the Navigation menu, navigate to Data Catalog.](./images/navigate-data-catalog.png " ")

2. On the **Data Catalog Overview** page, click **Go to Data Catalogs**.
3. On the **Data Catalog Overview** page, click **Go to Data Catalogs**.

![The Go to Data Catalogs button is highlighted.](./images/data-catalog-overview.png " ")

3. On the **Data Catalogs** page, click **Create data catalog**.
4. On the **Data Catalogs** page, click **Create data catalog**.

![The Create Data Catalog button in the training-dcat-compartment is highlighted.](./images/data-catalog-page.png " ")

4. Select the **`training-dcat-compartment`** compartment from the **Create in compartment** drop-down list, if not already selected.
5. Select the **`training-dcat-compartment`** compartment from the **Create in compartment** drop-down list, if not already selected.

5. Enter **`training-dcat-instance`** in the **Name** field.
6. Enter **`training-dcat-instance`** in the **Name** field.

![The completed Create Data Catalog dialog box is displayed. The Create Data Catalog button is highlighted.](./images/create-data-catalog.png " ")

6. Click **Create data catalog**. The Data Catalog instance is created and displayed in the **Data Catalogs** page.
7. Click **Create data catalog**. The Data Catalog instance is created and displayed in the **Data Catalogs** page.

![The newly created Data Catalog instance is displayed with an Active state.](./images/click-data-catalog.png " ")
