* Upgrade presets for Triton Inference Server 2.38.0 (pull #1420)
jbkyang-nvi authored Oct 13, 2023
1 parent c2d0a7d commit 275be0b
Showing 5 changed files with 12 additions and 12 deletions.
2 changes: 1 addition & 1 deletion .github/actions/deploy-ubuntu/action.yml
@@ -96,7 +96,7 @@ runs:
# $SUDO apt-key adv --keyserver keyserver.ubuntu.com --recv-keys BA6932366A755776
$SUDO apt-get update
$SUDO apt-get -y install gcc-multilib g++-multilib gfortran-multilib python3 python2.7 python3-minimal python2.7-minimal rpm libasound2-dev:$ARCH freeglut3-dev:$ARCH libfontconfig-dev:$ARCH libgtk2.0-dev:$ARCH libusb-dev:$ARCH libusb-1.0-0-dev:$ARCH libffi-dev:$ARCH libbz2-dev:$ARCH zlib1g-dev:$ARCH libxcb1-dev:$ARCH
- $SUDO apt-get -y install pkg-config ccache clang $TOOLCHAIN openjdk-8-jdk ant python2 python3-pip swig git file wget unzip tar bzip2 gzip patch autoconf-archive autogen automake make libtool bison flex perl nasm ragel curl libcurl4-openssl-dev libssl-dev libffi-dev libbz2-dev zlib1g-dev rapidjson-dev
+ $SUDO apt-get -y install pkg-config ccache clang $TOOLCHAIN openjdk-8-jdk ant python2 python3-pip swig git file wget unzip tar bzip2 gzip patch autoconf-archive autogen automake cmake make libtool bison flex perl nasm ragel curl libcurl4-openssl-dev libssl-dev libffi-dev libbz2-dev zlib1g-dev rapidjson-dev
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64/
echo "JAVA_HOME=$JAVA_HOME" >> $GITHUB_ENV
2 changes: 1 addition & 1 deletion .github/workflows/tritonserver.yml
@@ -19,6 +19,6 @@ env:
jobs:
linux-x86_64:
runs-on: ubuntu-20.04
- container: nvcr.io/nvidia/tritonserver:23.05-py3
+ container: nvcr.io/nvidia/tritonserver:23.09-py3
steps:
- uses: bytedeco/javacpp-presets/.github/actions/deploy-ubuntu@actions
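The container tag follows NVIDIA's monthly NGC releases, and each YY.MM image bundles one Triton release: 23.09 ships Triton Inference Server 2.38.0, matching the version bump elsewhere in this commit. One way to confirm the pairing locally (a sketch, assuming the image ships the `TRITON_VERSION` file that recent NGC releases include):

```bash
# Print the Triton release bundled in the 23.09 NGC image
docker run --rm nvcr.io/nvidia/tritonserver:23.09-py3 \
    cat /opt/tritonserver/TRITON_VERSION   # expected output: 2.38.0
```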
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -3,7 +3,7 @@
* Refactor and improve presets for PyTorch ([pull #1360](https://github.com/bytedeco/javacpp-presets/pull/1360))
* Include `mkl_lapack.h` header file in presets for MKL ([issue #1388](https://github.com/bytedeco/javacpp-presets/issues/1388))
* Map new higher-level C++ API of Triton Inference Server ([pull #1361](https://github.com/bytedeco/javacpp-presets/pull/1361))
- * Upgrade presets for OpenCV 4.8.0, DNNL 3.2.1, OpenBLAS 0.3.24, CPython 3.11.5, NumPy 1.25.2, SciPy 1.11.2, LLVM 17.0.1, TensorFlow Lite 2.14.0, Triton Inference Server 2.34.0, ONNX 1.14.1, ONNX Runtime 1.16.0, TVM 0.13.0, and their dependencies
+ * Upgrade presets for OpenCV 4.8.0, DNNL 3.2.1, OpenBLAS 0.3.24, CPython 3.11.5, NumPy 1.25.2, SciPy 1.11.2, LLVM 17.0.1, TensorFlow Lite 2.14.0, Triton Inference Server 2.38.0, ONNX 1.14.1, ONNX Runtime 1.16.0, TVM 0.13.0, and their dependencies

### June 6, 2023 version 1.5.9
* Virtualize `nvinfer1::IGpuAllocator` from TensorRT to allow customization ([pull #1367](https://github.com/bytedeco/javacpp-presets/pull/1367))
10 changes: 5 additions & 5 deletions tritonserver/README.md
@@ -23,7 +23,7 @@ Introduction
------------
This directory contains the JavaCPP Presets module for:

- * Triton Inference Server 2.34.0 https://github.com/triton-inference-server/server
+ * Triton Inference Server 2.38.0 https://github.com/triton-inference-server/server

Please refer to the parent README.md file for more detailed information about the JavaCPP Presets.

@@ -51,17 +51,17 @@ This sample intends to show how to call the Java-mapped C API of Triton to execu

1. Get the source code of Triton Inference Server to prepare the model repository:
```bash
- $ wget https://github.com/triton-inference-server/server/archive/refs/tags/v2.34.0.tar.gz
- $ tar zxvf v2.34.0.tar.gz
- $ cd server-2.34.0/docs/examples/model_repository
+ $ wget https://github.com/triton-inference-server/server/archive/refs/tags/v2.38.0.tar.gz
+ $ tar zxvf v2.38.0.tar.gz
+ $ cd server-2.38.0/docs/examples/model_repository
$ mkdir models
$ cd models; cp -a ../simple .
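# Expected layout after the copy -- a sketch, not verified against this
# exact release; the upstream "simple" example ships a config.pbtxt and
# a single versioned TensorFlow graphdef:
#   models/simple/config.pbtxt
#   models/simple/1/model.graphdef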
```
Now, this `models` directory will be our model repository.

2. Start the Docker container to run the sample (assuming we are under the `models` directory created above):
```bash
- $ docker run -it --gpus=all -v $(pwd):/workspace nvcr.io/nvidia/tritonserver:23.05-py3 bash
+ $ docker run -it --gpus=all -v $(pwd):/workspace nvcr.io/nvidia/tritonserver:23.09-py3 bash
$ apt update
$ apt install -y openjdk-11-jdk
$ wget https://archive.apache.org/dist/maven/maven-3/3.8.4/binaries/apache-maven-3.8.4-bin.tar.gz
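# The rest of this hunk is collapsed in the diff view; a sketch of the
# likely continuation, assuming the stock Maven binary-archive layout:
$ tar zxvf apache-maven-3.8.4-bin.tar.gz
$ export PATH=$PWD/apache-maven-3.8.4/bin:$PATH
$ mvn --version   # should report Maven 3.8.4 running on OpenJDK 11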
8 changes: 4 additions & 4 deletions tritonserver/cppbuild.sh
@@ -7,13 +7,13 @@ if [[ -z "$PLATFORM" ]]; then
exit
fi

- INCLUDE_DEVELOPER_TOOLS_SERVER=${INCLUDE_DEVELOPER_TOOLS_SERVER:=0}
+ INCLUDE_DEVELOPER_TOOLS_SERVER=${INCLUDE_DEVELOPER_TOOLS_SERVER:=1}

- if [[ ! -f "/opt/tritonserver/include/triton/developer_tools/generic_server_wrapper.h" ]] && [[ ! -f "/opt/tritonserver/lib/libtritondevelopertoolsserver.so" ]] && [[ ${INCLUDE_DEVELOPER_TOOLS_SERVER} -eq 0 ]]; then
+ if [[ ! -f "/opt/tritonserver/include/triton/developer_tools/generic_server_wrapper.h" ]] && [[ ! -f "/opt/tritonserver/lib/libtritondevelopertoolsserver.so" ]] && [[ ${INCLUDE_DEVELOPER_TOOLS_SERVER} -ne 0 ]]; then
TOOLS_BRANCH=${TOOLS_BRANCH:="https://github.com/triton-inference-server/developer_tools.git"}
- TOOLS_BRANCH_TAG=${TOOLS_BRANCH_TAG:="main"}
+ TOOLS_BRANCH_TAG=${TOOLS_BRANCH_TAG:="r23.09"}
TRITON_CORE_REPO=${TRITON_CORE_REPO:="https://github.com/triton-inference-server/core.git"}
- TRITON_CORE_REPO_TAG=${TRITON_CORE_REPO_TAG="r23.05"}
+ TRITON_CORE_REPO_TAG=${TRITON_CORE_REPO_TAG="r23.09"}
TRITON_HOME="/opt/tritonserver"
BUILD_HOME="$PWD"/tritonbuild
mkdir -p ${BUILD_HOME} && cd ${BUILD_HOME}
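Two things change in this hunk beyond the tag bumps to r23.09. First, `INCLUDE_DEVELOPER_TOOLS_SERVER` now defaults to 1, so the developer_tools wrapper is built unless the caller opts out. Second, the guard's old `-eq 0` test contradicted the flag's intent; with `-ne 0`, the wrapper sources are cloned only when the flag is on and the prebuilt header/library are absent. The `${VAR:=default}` expansion supplies the new default while still honoring an explicit caller value; a minimal sketch of the idiom:

```bash
#!/bin/bash
# ${VAR:=default} assigns default only when VAR is unset or empty
unset INCLUDE_DEVELOPER_TOOLS_SERVER
echo "${INCLUDE_DEVELOPER_TOOLS_SERVER:=1}"   # prints 1: default applied

INCLUDE_DEVELOPER_TOOLS_SERVER=0
echo "${INCLUDE_DEVELOPER_TOOLS_SERVER:=1}"   # prints 0: caller value kept
```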
