```bash
$ git clone [email protected]:gallery-so/go-gallery.git
$ cd go-gallery
$ go get -u=patch -d ./...
$ go build -o ./bin/main ./cmd/server/main.go
```

This will generate a binary at `./bin/main`. To run the binary, simply run:

```bash
$ ./bin/main
```
The app will connect to a local Redis and a local Postgres instance by default. To spin them up, you can use the Docker commands below.
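If the server fails to start, a quick way to diagnose it is to check whether both services are actually listening on their default ports (6379 for Redis, 5432 for Postgres). This helper is illustrative and not part of the repo's tooling:

```shell
# Check whether a TCP port is accepting connections (bash-only, uses /dev/tcp)
port_open() {
  local host="$1" port="$2"
  (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null
}

port_open 127.0.0.1 6379 && echo "redis: up" || echo "redis: down"
port_open 127.0.0.1 5432 && echo "postgres: up" || echo "postgres: down"
```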
## [Optional] Shell script to seed NFT data
If you want to seed your local database with real, indexed data from our dev or production clusters, you can "prep" your environment using the following bash script. Running this won't execute the import itself; it triggers the import when you run the Docker commands below. As a prerequisite, you must have access to `_encrypted_deploy` in order to access the dev / prod clusters.

Note that if you run the following command, don't run `make g-docker` and upload the image to Dockerhub; this would expose the locally migrated data to the public. You can avoid this by opening a new shell. More on `make g-docker` further below.

Finally: if you are using bash/sh instead of zsh, change the first line of the `_import_env.sh` file to match your shell.

```bash
$ source ./_import_env.sh <path to dev/prod backend app.yaml> <address of the dev/prod wallet to import data from>
```
## Docker commands
Build the Docker containers. If you ran the shell script above, the seed script will be executed. You can re-run this command in the future if you want the latest data:

```bash
$ make docker-build
```

Run the Docker containers:

```bash
$ make docker-start
```

To remove the running Redis and Postgres instances:

```bash
$ make docker-stop
```
## Working with migrations
The `migrate` CLI can be installed via Homebrew (assuming macOS):

```bash
brew install golang-migrate
```
Create a new migration:

```bash
# New migration for the backend db
migrate create -ext sql -dir db/migrations/core -seq <name of migration>
```
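The `-seq` flag numbers migrations sequentially, and each migration is a pair of `.up.sql` / `.down.sql` files. As an illustration of the resulting layout (the migration name and SQL here are hypothetical, not from the repo):

```shell
# Simulate the file pair golang-migrate creates for a -seq migration
dir=$(mktemp -d)
cat > "$dir/000001_add_user_bio.up.sql" <<'SQL'
ALTER TABLE users ADD COLUMN bio text;
SQL
cat > "$dir/000001_add_user_bio.down.sql" <<'SQL'
ALTER TABLE users DROP COLUMN bio;
SQL
ls "$dir"
```

`migrate ... up` applies the `.up.sql` files in order, while `down 1` reverts the most recent one using its matching `.down.sql`.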
Run a migration locally:

```bash
# Run all migrations for the local backend db
make migrate-coredb
```
Run a migration on the dev backend db:

```bash
# Apply an up migration to the backend db
migrate -path db/migrations/core -database "postgresql://postgres:<dev db password here>@34.102.59.201:5432/postgres" up

# Undo the last migration to the backend db
migrate -path db/migrations/core -database "postgresql://postgres:<dev db password here>@34.102.59.201:5432/postgres" down 1
```
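Since the dev db is shared, it can help to check which migration it is currently at before applying anything; `version` is a standard golang-migrate subcommand:

```bash
# Print the current migration version of the dev backend db
migrate -path db/migrations/core -database "postgresql://postgres:<dev db password here>@34.102.59.201:5432/postgres" version
```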
Verify that the server is running:

```bash
$ curl localhost:4000/alive
```

The same endpoint is available in live environments:

```bash
$ curl api.gallery.so/alive
```
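When scripting against a freshly started server, you may want to block until `/alive` responds before proceeding. A minimal sketch (the helper name and retry count are illustrative, not part of the repo's tooling):

```shell
# Poll a health endpoint until it responds, up to a retry limit
wait_alive() {
  local url="$1" tries="${2:-30}"
  local i
  for i in $(seq "$tries"); do
    curl -fsS "$url" >/dev/null 2>&1 && return 0
    sleep 1
  done
  return 1
}

# Example: wait_alive http://localhost:4000/alive && echo "server is up"
```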
## Testing
Run all tests in the current directory and all of its subdirectories:

```bash
$ go test ./...
```
Run all tests in a subdirectory (e.g. `/server`):

```bash
$ go test ./server/...
```
Run a specific test by including the `-run` flag. The example below runs GraphQL tests under the `TestMain` suite that start with "should get trending":

```bash
go test -run=TestMain/"test GraphQL"/"should get trending" ./graphql
```
Add `-v` for detailed logs.
Skip longer-running tests with the `-short` flag:

```bash
go test -short
```
If you have access to the `_encrypted_local` file, you can run the server locally with live data, which is useful for testing against real data. For example, to run the server with live data from the dev environment:

```bash
go run cmd/server/main.go dev
```
```bash
# Do not switch the order of the buckets! Doing so may overwrite prod data.
gsutil -m rsync -r gs://prod-eth-token-logs gs://dev-eth-token-logs
```
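Because a reversed argument order would sync dev data over prod, it is worth previewing the transfer first; `gsutil rsync` supports a `-n` dry-run flag that lists what would be copied without modifying the destination:

```bash
# Preview the sync without changing the destination bucket
gsutil -m rsync -n -r gs://prod-eth-token-logs gs://dev-eth-token-logs
```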
These are the added certificates included in the `_deploy` folder. They are used to verify the SSL certificates of various HTTPS endpoints:

- `sectigo.crt` - Sectigo RSA Domain Validation Secure Server CA (used, for example, by vibes.art)