Commit 6f9b874: CLI adjustments (#154)

Stentonian authored Jan 31, 2024
2 parents 3cd045b + fc09ba8

Showing 13 changed files with 209 additions and 73 deletions.
3 changes: 2 additions & 1 deletion .gitignore
@@ -11,4 +11,5 @@ Cargo.lock

# Generated files
**/*.dapoltree
**/*.dapolproof
inclusion_proofs/
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,10 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).

## v0.3.1 (2024-01-20)

- Minor updates to the CLI [PR 154](https://github.com/silversixpence-crypto/dapol/pull/154)

## v0.3.0 (2024-01-20)

- Adjust API to read better using DapolTree instead of Accumulator [36dd58f](https://github.com/silversixpence-crypto/dapol/commit/36dd58fcd9cd2100ac7a1c4a7010faab3397770f). Also included in this change:
2 changes: 1 addition & 1 deletion Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "dapol"
version = "0.3.0"
version = "0.3.1"
authors = ["Stenton Mayne <[email protected]>"]
edition = "2021"
description = "DAPOL+ Proof of Liabilities protocol"
40 changes: 18 additions & 22 deletions README.md
@@ -40,29 +40,21 @@ See the [examples](https://github.com/silversixpence-crypto/dapol/examples) directory

### CLI

There is no downloadable executable ([yet](https://github.com/silversixpence-crypto/dapol/issues/110)) so the CLI has to be built from source. You will need to have the rust compiler installed:
Install with cargo:
```bash
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs >> ./rustup-init.sh
./rustup-init.sh -y --no-modify-path
rm -f ./rustup-init.sh
```

For now you must clone the repo to use the CLI. Once you have cloned the repo build everything:
```bash
# run inside the repo
cargo build --release
cargo install dapol
```

You can invoke the CLI like so:
```bash
./target/release/dapol help
dapol help
```

The CLI offers 3 main operations: tree building, proof generation & proof verification. All options can be explored with:
```bash
./target/release/dapol build-tree help
./target/release/dapol gen-proofs help
./target/release/dapol verify-proof help
dapol build-tree help
dapol gen-proofs help
dapol verify-proof help
```

#### Tree building
@@ -74,44 +66,48 @@ Building a tree can be done:

Build a tree using config file (full log verbosity):
```bash
./target/release/dapol -vvv build-tree config-file ./examples/dapol_config_example.toml
dapol -vvv build-tree config-file ./examples/dapol_config_example.toml
```

Add serialization:
```bash
./target/release/dapol -vvv build-tree config-file ./examples/dapol_config_example.toml --serialize .
dapol -vvv build-tree config-file ./examples/dapol_config_example.toml --serialize .
```

Deserialize a tree from a file:
```bash
./target/release/dapol -vvv build-tree deserialize <file>
dapol -vvv build-tree deserialize <file>
```

Generate proofs (proofs will live in the `./inclusion_proofs/` directory):
```bash
./target/release/dapol -vvv build-tree config-file ./examples/dapol_config_example.toml --gen-proofs ./examples/entities_example.csv
dapol -vvv build-tree config-file ./examples/dapol_config_example.toml --gen-proofs ./examples/entities_example.csv
```

Build a tree using CLI args as opposed to a config file:
```bash
# this will generate random secrets & 1000 random entities
./target/release/dapol -vvv build-tree new --accumulator ndm-smt --height 16 --random-entities 1000
# this will generate 1000 random entities
dapol -vvv build-tree new --accumulator ndm-smt --height 16 --random-entities 1000 --secrets-file ./examples/dapol_secrets_example.toml
```

#### Proof generation

As seen above, the proof generation can be done via the tree build command, but it can also be done via its own command, which offers some more options around how the proofs are generated.

```bash
./target/release/dapol -vvv gen-proofs --entity-ids ./examples/entities_example.csv --tree-file <serialized_tree_file>
dapol -vvv gen-proofs --entity-ids ./examples/entities_example.csv --tree-file <serialized_tree_file>
```

```bash
echo "[email protected]" | dapol -vvv gen-proofs --tree-file examples/my_serialized_tree_for_testing.dapoltree --entity-ids -
```

The proof generation command offers only one way to inject the tree (deserialization), as opposed to the tree build command, which offers several options.

#### Proof verification

```bash
./target/release/dapol -vvv verify-proof --file-path <inclusion_proof_file> --root-hash <hash>
dapol -vvv verify-proof --file-path <inclusion_proof_file> --root-hash <hash>
```

The root hash is logged out at info level when the tree is built or deserialized.
Expand Down
4 changes: 2 additions & 2 deletions benches/criterion_benches.rs
@@ -15,7 +15,7 @@ use criterion::{criterion_group, criterion_main};
use criterion::{BenchmarkId, Criterion, SamplingMode};
use statistical::*;

use dapol::{DapolConfigBuilder, DapolTree, InclusionProof, Secret};
use dapol::{DapolConfigBuilder, DapolTree, InclusionProof, Secret, InclusionProofFileType};

mod inputs;
use inputs::{max_thread_counts_greater_than, num_entities_in_range, tree_heights_in_range};
@@ -292,7 +292,7 @@ pub fn bench_generate_proof<T: Measurement>(c: &mut Criterion<T>) {
std::fs::create_dir_all(dir.clone()).unwrap();
let path = proof
.expect("Proof should be set")
.serialize(entity_id, dir)
.serialize(entity_id, dir, InclusionProofFileType::Binary)
.unwrap();
let file_size = std::fs::metadata(path)
.expect("Unable to get serialized tree metadata for {path}")
6 changes: 3 additions & 3 deletions src/accumulators/ndm_smt.rs
@@ -309,7 +309,7 @@ impl NdmSmt {
.entity_mapping
.get(entity_id)
.and_then(|leaf_x_coord| self.binary_tree.get_leaf_node(*leaf_x_coord))
.ok_or(NdmSmtError::EntityIdNotFound)?;
.ok_or(NdmSmtError::EntityIdNotFound(entity_id.clone()))?;

let path_siblings = PathSiblings::build_using_multi_threaded_algorithm(
&self.binary_tree,
@@ -394,8 +394,8 @@ pub enum NdmSmtError {
InclusionProofPathSiblingsGenerationError(#[from] crate::binary_tree::PathSiblingsBuildError),
#[error("Inclusion proof generation failed")]
InclusionProofGenerationError(#[from] crate::inclusion_proof::InclusionProofError),
#[error("Entity ID not found in the entity mapping")]
EntityIdNotFound,
#[error("Entity ID {0:?} not found in the entity mapping")]
EntityIdNotFound(EntityId),
#[error("Entity ID {0:?} was duplicated")]
DuplicateEntityIds(EntityId),
}
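The effect of the variant change above can be sketched without the crate: `thiserror`'s derive is replaced here by a hand-written `Display` impl, and `EntityId` is a stand-in newtype, so this is a std-only illustration rather than the crate's actual code.

```rust
use std::fmt;

// Stand-in for the crate's EntityId newtype.
#[derive(Debug, Clone)]
struct EntityId(String);

// After the change the variant carries the ID that was not found,
// so the rendered error names the offending entity.
#[derive(Debug)]
enum NdmSmtError {
    EntityIdNotFound(EntityId),
}

// thiserror would derive this from the #[error(...)] attribute;
// written out by hand so the sketch needs no external crates.
impl fmt::Display for NdmSmtError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            NdmSmtError::EntityIdNotFound(id) => {
                write!(f, "Entity ID {:?} not found in the entity mapping", id)
            }
        }
    }
}

fn main() {
    let err = NdmSmtError::EntityIdNotFound(EntityId("[email protected]".into()));
    // The message now pinpoints which entity was missing.
    assert!(err.to_string().contains("[email protected]"));
    println!("{err}");
}
```

Carrying the ID in the variant costs one clone at the error site but saves the caller from re-deriving which lookup failed.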
13 changes: 9 additions & 4 deletions src/cli.rs
@@ -3,7 +3,7 @@
//! See [MAIN_LONG_ABOUT] for more information.

use clap::{command, Args, Parser, Subcommand};
use clap_verbosity_flag::{Verbosity, WarnLevel};
use clap_verbosity_flag::{InfoLevel, Verbosity};
use patharg::{InputArg, OutputArg};
use primitive_types::H256;

@@ -12,8 +12,9 @@ use std::str::FromStr;
use crate::{
accumulators::AccumulatorType,
binary_tree::Height,
inclusion_proof,
percentage::{Percentage, ONE_HUNDRED_PERCENT},
MaxLiability, MaxThreadCount, Salt,
InclusionProofFileType, MaxLiability, MaxThreadCount, Salt,
};

// -------------------------------------------------------------------------------------------------
@@ -30,7 +31,7 @@ pub struct Cli {
pub command: Command,

#[command(flatten)]
pub verbose: Verbosity<WarnLevel>,
pub verbose: Verbosity<InfoLevel>,
}

#[derive(Debug, Subcommand)]
@@ -87,6 +88,10 @@ pub enum Command {
/// are aggregated using the Bulletproofs protocol.
#[arg(short, long, value_parser = Percentage::from_str, default_value = ONE_HUNDRED_PERCENT, value_name = "PERCENTAGE")]
range_proof_aggregation: Percentage,

/// File type for proofs (supported types: binary, json).
#[arg(short, long, value_parser = InclusionProofFileType::from_str, default_value = InclusionProofFileType::default())]
file_type: inclusion_proof::InclusionProofFileType,
},

/// Verify an inclusion proof.
@@ -144,7 +149,7 @@ pub enum BuildKindCommand {
max_thread_count: MaxThreadCount,

#[arg(short, long, value_name = "FILE_PATH", long_help = SECRETS_HELP)]
secrets_file: Option<InputArg>,
secrets_file: InputArg,

#[command(flatten)]
entity_source: EntitySource,
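The new `--file-type` flag leans on clap's `value_parser = InclusionProofFileType::from_str` pattern. A hypothetical std-only sketch of such a type is below; the variant names are assumptions taken from the diff ("supported types: binary, json" and `InclusionProofFileType::Binary` in the benches), and the real type in `src/inclusion_proof` may differ.

```rust
use std::str::FromStr;

// Hypothetical sketch of the type behind --file-type. Binary is the
// default, matching the flag's default_value of Default::default().
#[derive(Debug, Default, PartialEq)]
enum InclusionProofFileType {
    #[default]
    Binary,
    Json,
}

impl FromStr for InclusionProofFileType {
    type Err = String;

    // clap routes the raw CLI string through this impl.
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s.to_ascii_lowercase().as_str() {
            "binary" => Ok(Self::Binary),
            "json" => Ok(Self::Json),
            other => Err(format!("unsupported proof file type: {other}")),
        }
    }
}

fn main() {
    assert_eq!(
        InclusionProofFileType::from_str("json"),
        Ok(InclusionProofFileType::Json)
    );
    assert_eq!(InclusionProofFileType::default(), InclusionProofFileType::Binary);
    assert!(InclusionProofFileType::from_str("yaml").is_err());
}
```

Giving the enum a `Default` lets the CLI and the library share one notion of the standard serialization format.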
4 changes: 2 additions & 2 deletions src/entity.rs
@@ -40,13 +40,13 @@ pub const ENTITY_ID_MAX_BYTES: usize = 32;
pub struct EntityId(String);

impl FromStr for EntityId {
type Err = EntitiesParserError;
type Err = EntityIdsParserError;

/// Constructor that takes in a string slice.
/// If the length of the str is greater than the max then Err is returned.
fn from_str(s: &str) -> Result<Self, Self::Err> {
if s.len() > ENTITY_ID_MAX_BYTES {
Err(EntitiesParserError::EntityIdTooLongError { id: s.into() })
Err(Self::Err::EntityIdTooLongError { id: s.into() })
} else {
Ok(EntityId(s.into()))
}
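The `FromStr` impl above, with its error type swapped to `EntityIdsParserError`, reduces to the following std-only sketch. The error enum is trimmed to the one variant exercised here; the real enum has several more.

```rust
use std::str::FromStr;

pub const ENTITY_ID_MAX_BYTES: usize = 32;

#[derive(Debug, PartialEq)]
pub struct EntityId(String);

// Trimmed to the single variant this sketch needs.
#[derive(Debug, PartialEq)]
pub enum EntityIdsParserError {
    EntityIdTooLongError { id: String },
}

impl FromStr for EntityId {
    type Err = EntityIdsParserError;

    // Reject IDs longer than the 32-byte cap; otherwise wrap the string.
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        if s.len() > ENTITY_ID_MAX_BYTES {
            Err(Self::Err::EntityIdTooLongError { id: s.into() })
        } else {
            Ok(EntityId(s.into()))
        }
    }
}

fn main() {
    assert!(EntityId::from_str("[email protected]").is_ok());
    assert!(EntityId::from_str(&"x".repeat(ENTITY_ID_MAX_BYTES + 1)).is_err());
}
```

Note that `s.len()` counts bytes, not characters, so multi-byte UTF-8 IDs hit the cap sooner than their character count suggests.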
4 changes: 0 additions & 4 deletions src/entity/entities_parser.rs
@@ -170,10 +170,6 @@ pub enum EntitiesParserError {
UnsupportedFileType { ext: String },
#[error("Error opening or reading CSV file")]
CsvError(#[from] csv::Error),
#[error(
"The given entity ID ({id:?}) is longer than the max allowed {ENTITY_ID_MAX_BYTES} bytes"
)]
EntityIdTooLongError { id: String },
}

// -------------------------------------------------------------------------------------------------
81 changes: 65 additions & 16 deletions src/entity/entity_ids_parser.rs
@@ -1,7 +1,7 @@
use std::str::FromStr;
use std::{ffi::OsString, path::PathBuf};

use log::debug;
use log::{debug, info};

use crate::entity::{EntityId, ENTITY_ID_MAX_BYTES};

@@ -23,6 +23,7 @@ use crate::entity::{EntityId, ENTITY_ID_MAX_BYTES};
/// ```
pub struct EntityIdsParser {
path: Option<PathBuf>,
entity_ids_list: Option<String>,
}

/// Supported file types for the parser.
@@ -31,21 +32,55 @@ enum FileType {
}

impl EntityIdsParser {
/// Open and parse the file, returning a vector of entity IDs.
/// The file is expected to hold 1 or more entity records.
/// Parse the input.
///
/// An error is returned if:
/// a) the file cannot be opened
/// b) the file type is not supported
/// c) deserialization of any of the records in the file fails
/// If the `path` field is set then:
/// - Open and parse the file, returning a vector of entity IDs.
/// - The file is expected to hold 1 or more entity records.
/// - An error is returned if:
/// a) the file cannot be opened
/// b) the file type is not supported
/// c) deserialization of any of the records in the file fails
///
/// If `path` is not set and `entity_ids_list` is then:
/// - Parse the value as a string list using `serde_json`
/// - An error is returned if:
/// a) deserialization using `serde_json` fails
///
/// If neither are set then an error is returned.
pub fn parse(self) -> Result<Vec<EntityId>, EntityIdsParserError> {
if let Some(path) = self.path {
EntityIdsParser::parse_csv(path)
} else if let Some(entity_ids_list) = self.entity_ids_list {
EntityIdsParser::parse_list(entity_ids_list)
} else {
Err(EntityIdsParserError::NeitherPathNorListSet)
}
}

fn parse_list(mut entity_ids_list: String) -> Result<Vec<EntityId>, EntityIdsParserError> {
// Remove trailing newline if it exists.
if entity_ids_list.chars().nth_back(0).map_or(false, |c| c == '\n') {
entity_ids_list.pop();
}

// Assume the input is a comma-separated list.
let parts = entity_ids_list.split(',');

let mut entity_ids = Vec::<EntityId>::new();
for part in parts {
entity_ids.push(EntityId::from_str(&part)?)
}

Ok(entity_ids)
}

fn parse_csv(path: PathBuf) -> Result<Vec<EntityId>, EntityIdsParserError> {
debug!(
"Attempting to parse {:?} as a file containing a list of entity IDs",
&self.path
&path
);

let path = self.path.ok_or(EntityIdsParserError::PathNotSet)?;

let ext = path.extension().and_then(|s| s.to_str()).ok_or(
EntityIdsParserError::UnknownFileType(path.clone().into_os_string()),
)?;
@@ -71,14 +106,28 @@ impl EntityIdsParser {

impl From<PathBuf> for EntityIdsParser {
fn from(path: PathBuf) -> Self {
Self { path: Some(path) }
Self {
path: Some(path),
entity_ids_list: None,
}
}
}

impl FromStr for EntityIdsParser {
type Err = EntityIdsParserError;

fn from_str(value: &str) -> Result<Self, Self::Err> {
Ok(Self {
path: None,
entity_ids_list: Some(value.to_string()),
})
}
}

impl FromStr for FileType {
type Err = EntityIdsParserError;

fn from_str(ext: &str) -> Result<FileType, Self::Err> {
fn from_str(ext: &str) -> Result<Self, Self::Err> {
match ext {
"csv" => Ok(FileType::Csv),
_ => Err(EntityIdsParserError::UnsupportedFileType { ext: ext.into() }),
Expand All @@ -92,16 +141,16 @@ impl FromStr for FileType {
/// Errors encountered when handling [EntityIdsParser].
#[derive(thiserror::Error, Debug)]
pub enum EntityIdsParserError {
#[error("Expected path to be set but found none")]
PathNotSet,
#[error("Expected num_entities to be set but found none")]
NumEntitiesNotSet,
#[error("Either path or entity_id_list must be set")]
NeitherPathNorListSet,
#[error("Unable to find file extension for path {0:?}")]
UnknownFileType(OsString),
#[error("The file type with extension {ext:?} is not supported")]
UnsupportedFileType { ext: String },
#[error("Error opening or reading CSV file")]
CsvError(#[from] csv::Error),
#[error("Problem serializing/deserializing with serde_json")]
JsonSerdeError(#[from] serde_json::Error),
#[error(
"The given entity ID ({id:?}) is longer than the max allowed {ENTITY_ID_MAX_BYTES} bytes"
)]
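The new `parse_list` path boils down to trimming at most one trailing newline (as left by `echo`) and splitting on commas, which is what makes the stdin pipe in the README work. A std-only sketch, with `EntityId` reduced to a plain `String` wrapper:

```rust
// Std-only sketch of EntityIdsParser::parse_list: strip one trailing
// newline, then treat the input as a comma-separated list.
#[derive(Debug, PartialEq)]
struct EntityId(String);

fn parse_list(mut entity_ids_list: String) -> Vec<EntityId> {
    // Remove a single trailing newline if one exists.
    if entity_ids_list.ends_with('\n') {
        entity_ids_list.pop();
    }

    // Assume the input is a comma-separated list.
    entity_ids_list
        .split(',')
        .map(|part| EntityId(part.to_string()))
        .collect()
}

fn main() {
    let ids = parse_list("[email protected],[email protected]\n".to_string());
    assert_eq!(
        ids,
        vec![
            EntityId("[email protected]".to_string()),
            EntityId("[email protected]".to_string()),
        ]
    );
}
```

The real method goes through `EntityId::from_str` for each part, so the 32-byte cap is enforced per element rather than accepted silently as it is in this sketch.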
