Merge pull request #315 from threedworld-mit/v1.9_links
Fixed documentation links
alters-mit authored Dec 14, 2021
2 parents 4134426 + c34efe2 commit 47f1d6c
Showing 8 changed files with 9 additions and 9 deletions.
2 changes: 1 addition & 1 deletion Documentation/lessons/3d_models/custom_models.md
@@ -7,7 +7,7 @@ It is possible to add any 3D model to TDW. However, the underlying Unity engine
## Requirements

- Windows 10, OS X, or Linux
- (Windows only) [Visual C++ 2012 Redistributable](https://www.microsoft.com/en-us/download/confirmation.aspx?id=30679)
- (Windows only) Visual C++ 2012 Redistributable
- The `tdw` module
- Python 3.6+
- Unity Hub
2 changes: 1 addition & 1 deletion Documentation/lessons/audio/record_audio.md
@@ -147,7 +147,7 @@ This controller will output two files per trial:
Example controllers:

- [minimal_audio_dataset.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/audio/minimal_audio_dataset.py) A minimal example of how to record a physics audio dataset using `AudioInitializer`, `PyImpact`, and `PhysicsAudioRecorder`.
- [scrape.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/audio/impact_and_scrape.py) Record scrape sounds.
- [scrape.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/audio/scrape.py) Record scrape sounds.
- [rube_goldberg.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/audio/rube_goldberg.py)

Python API:
2 changes: 1 addition & 1 deletion Documentation/lessons/humans/keyboard.md
@@ -79,7 +79,7 @@ In order to use keyboard controls, the TDW build window must be focused (i.e. be

## Control an avatar with the keyboard

In this example, the human user can control an [`EmbodiedAvatar`](../embodied_avatars/overview.md) with the keyboard.
In this example, the human user can control an [`EmbodiedAvatar`](../embodied_avatars/embodied_avatar.md) with the keyboard.

In order to listen for both *key presses* and *key holds*, we'll set the *events* parameter like this:
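
A hedged sketch of such a call, assuming the `Keyboard` add-on's `listen()` method accepts `key`, `function`, and `events` arguments; the key binding and callback below are placeholders, not the snippet from the documentation itself:

```python
# Editor's sketch: listen for both key presses and key holds.
from tdw.controller import Controller
from tdw.add_ons.keyboard import Keyboard


def move_forward():
    # Placeholder: in the real example this would move the EmbodiedAvatar.
    print("moving forward")


c = Controller()
keyboard = Keyboard()
# "press" fires once when the key goes down; "hold" fires on every frame the
# key stays down, so the callback keeps running while the key is held.
keyboard.listen(key="UpArrow", function=move_forward, events=["press", "hold"])
c.add_ons.append(keyboard)
c.communicate([])
```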

4 changes: 2 additions & 2 deletions Documentation/lessons/misc/c_sharp_sources.md
@@ -32,10 +32,10 @@ Your experience should include most, if not all, of the following:
Interfacing TDW with OpenAI Gym (or similar RL toolkits) can be done completely through the Python API. This has already been done on several projects.
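
As an illustration of what "completely through the Python API" can look like, here is an editor's sketch, not code from any TDW project: the environment class, spaces, reward, and observation handling are hypothetical placeholders, while `Controller`, `TDWUtils`, and the command dictionaries are standard TDW API.

```python
# Editor's sketch: wrap a TDW controller in a Gym-style environment.
import gym
import numpy as np
from gym import spaces
from tdw.controller import Controller
from tdw.tdw_utils import TDWUtils


class PushBoxEnv(gym.Env):
    def __init__(self):
        super().__init__()
        self.action_space = spaces.Box(-1.0, 1.0, shape=(1,), dtype=np.float32)
        self.observation_space = spaces.Box(-np.inf, np.inf, shape=(3,), dtype=np.float32)
        self.c = Controller()
        self.object_id = Controller.get_unique_id()
        # Build a minimal scene once: an empty room containing a single box.
        self.c.communicate([TDWUtils.create_empty_room(12, 12),
                            Controller.get_add_object(model_name="iron_box",
                                                      object_id=self.object_id)])

    def reset(self):
        # Move the object back to the origin for a fresh episode.
        self.c.communicate({"$type": "teleport_object",
                            "id": self.object_id,
                            "position": {"x": 0, "y": 0, "z": 0}})
        return np.zeros(3, dtype=np.float32)

    def step(self, action):
        # Apply the scalar action as a force magnitude and advance one physics frame.
        self.c.communicate({"$type": "apply_force_magnitude_to_object",
                            "id": self.object_id,
                            "magnitude": float(action[0])})
        obs = np.zeros(3, dtype=np.float32)  # Placeholder observation.
        return obs, 0.0, False, {}

    def close(self):
        self.c.communicate({"$type": "terminate"})
```

In a real environment, `step()` would request and parse output data (e.g. `Transforms`) to build observations and compute a reward.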

#### "I want to write my own scene setup tools / use my own custom scene data format"
The TDW Python API can handle a very broad range of scene setup scenarios, both procedural and explicitly scripted, including parsing custom data formats. There are a number of examples of how to set up scenes in our Example Controllers, including deserializing JSON files containing scene setup data. Our [Rube Goldberg Demo](https://github.com/threedworld-mit/tdw/blob/master/Documentation/python/use_cases/rube_goldberg.md) is a good example of this; while the controller does make use of "non-free" models, the scene setup logic used is the important point here.
The TDW Python API can handle a very broad range of scene setup scenarios, both procedural and explicitly scripted, including parsing custom data formats. There are a number of examples of how to set up scenes in our Example Controllers, including deserializing JSON files containing scene setup data. Our [Rube Goldberg Demo](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/audio/rube_goldberg.py) is a good example of this; while the controller does make use of "non-free" models, the scene setup logic used is the important point here.
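
A minimal sketch of the JSON-driven setup described above; the file name and schema (`"objects"` with `"model_name"`/`"position"` keys) are hypothetical, while `Controller.get_add_object()`, `TDWUtils.create_empty_room()`, and `communicate()` are standard TDW API calls:

```python
# Editor's sketch: build a scene from a custom JSON description.
import json
from tdw.controller import Controller
from tdw.tdw_utils import TDWUtils

c = Controller()
with open("scene_setup.json") as f:  # Hypothetical scene description file.
    scene = json.load(f)
commands = [TDWUtils.create_empty_room(12, 12)]
for obj in scene["objects"]:
    commands.append(Controller.get_add_object(model_name=obj["model_name"],
                                              object_id=Controller.get_unique_id(),
                                              position=obj["position"]))
c.communicate(commands)
```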

#### "I need to use custom models."
[We already support this.](https://github.com/threedworld-mit/tdw/blob/master/Documentation/misc_frontend/add_local_object.md)
[We already support this.](https://github.com/threedworld-mit/tdw/blob/master/Documentation/lessons/3d_models/custom_models.md)

#### "I want to add my own custom streamed scene."
We've deliberately restricted the backend pipeline for creating a TDW-compatible scene to the development team because adding scenes is much more complicated than creating a 3D model and requires 3D content-creation tools and experience. If you have a specific requirement for a custom 3D scene, please contact Jeremy Schwartz ([[email protected]](mailto:[email protected])) and we can discuss your particular situation.
2 changes: 1 addition & 1 deletion Documentation/lessons/photorealism/lighting.md
@@ -298,7 +298,7 @@ Example controllers:
- [hdri_skyboxes.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/photorealism/hdri_skyboxes.py) Add different HDRI skyboxes to the same scene.
- [lights_output_data.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/photorealism/lights_output_data.py) Load a streamed scene and receive Lights output data.
- [rotate_hdri_skybox.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/photorealism/rotate_hdri_skybox.py) Add an HDRI skybox to the scene and rotate it.
- [shadow_strength.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/photorealism/v.py) Show the difference between shadow strengths.
- [shadow_strength.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/photorealism/shadow_strength.py) Show the difference between shadow strengths.

Python API:

2 changes: 1 addition & 1 deletion Documentation/lessons/physx/forces.md
@@ -221,7 +221,7 @@ Result:
Example controllers:

- [ball_bounce.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/physx/ball_bounce.py) Bounce a ball on a table.
- [collision_and_friction.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/physx/collision_and_friction.py) Collide an object with another with varying physics values.
- [collisions_and_friction.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/physx/collisions_and_friction.py) Collide an object with another with varying physics values.
- [forcefield.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/physx/forcefield.py) Simulate a "forcefield" that objects will bounce off of.

Python API
2 changes: 1 addition & 1 deletion Documentation/lessons/video/audio.md
@@ -18,7 +18,7 @@ In setting up your controller:

1. See [install guide](../setup/install.md) for Docker requirements.
2. [Build this container.](https://github.com/threedworld-mit/tdw/blob/master/Docker/Dockerfile_audio)
3. Run [`start_container_audio_video.sh`](https://github.com/threedworld-mit/tdw/blob/master/Docker/start_container_audio.sh). You may need to adjust the `-video_size` and pixel offset (`$DISPLAY+1152,672`) parameters.
3. Run [`start_container_audio_video.sh`](https://github.com/threedworld-mit/tdw/blob/master/Docker/start_container_audio_video.sh). You may need to adjust the `-video_size` and pixel offset (`$DISPLAY+1152,672`) parameters.
4. To stop recording, you will need to stop the Docker container.
5. After recording, you will need to re-encode the video:
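
A hedged sketch of such a re-encode using Python's `subprocess` and common ffmpeg flags; the file names and codec choices are placeholders, not the exact command from the TDW docs:

```python
# Editor's sketch: re-encode the raw recording with ffmpeg.
import subprocess

subprocess.run(["ffmpeg",
                "-i", "raw_recording.mp4",  # Hypothetical raw capture from the container.
                "-c:v", "libx264",          # H.264 video for broad player compatibility.
                "-pix_fmt", "yuv420p",      # Pixel format most players accept.
                "-c:a", "aac",              # Re-encode the audio track.
                "output.mp4"],
               check=True)
```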

2 changes: 1 addition & 1 deletion Documentation/lessons/video/images.md
@@ -104,7 +104,7 @@ ffmpeg -f avfoundation -list_devices true -i ""

Example controllers:

- [image_only_video.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/robots/image_only_video.py) Capture image data and automatically call ffmpeg to convert it to a video.
- [image_only_video.py](https://github.com/threedworld-mit/tdw/blob/master/Python/example_controllers/video/image_only_video.py) Capture image data and automatically call ffmpeg to convert it to a video.

Command API:

