Merge pull request #4 from berylliumsec/fixing_args
fixing args and updating readme
berylliumsec-handler authored Mar 7, 2024
2 parents ccb236c + 8bc97e9 commit e62f034
Showing 2 changed files with 7 additions and 13 deletions.
8 changes: 4 additions & 4 deletions README.md
@@ -114,9 +114,9 @@ pip install eclipse-ai --upgrade
## Usage.

``` bash
-usage: eclipse [-h] [-p PROMPT] [-f FILE] [-m MODEL_PATH] [-o OUTPUT] [--debug] [-d DELIMITER] [-g] [-dir MODEL_DIRECTORY] [--line_by_line]
+usage: eclipse.py [-h] [-p PROMPT] [-f FILE] [-m MODEL_PATH] [-o OUTPUT] [--debug] [-d DELIMITER] [-g] [--line_by_line] [-c CONFIDENCE_THRESHOLD]

-Entity recognition using BERT.
+Sensitive Information Detector.

options:
-h, --help show this help message and exit
@@ -131,9 +131,9 @@ options:
-d DELIMITER, --delimiter DELIMITER
Delimiter to separate text inputs, defaults to newline.
-g, --use_gpu Enable GPU usage for model inference.
-  -dir MODEL_DIRECTORY, --model_directory MODEL_DIRECTORY
-                        Directory where the BERT model should be downloaded and unzipped.
--line_by_line Process text line by line and yield results incrementally.
+  -c CONFIDENCE_THRESHOLD, --confidence_threshold CONFIDENCE_THRESHOLD
+                        Confidence threshold for considering predictions as high confidence.
```

Here are some examples:
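As an illustration only (the prompt text, file paths, output name, and threshold value below are made up; the flags are taken from the usage block above, and the command is assumed to be installed as `eclipse` by pip):

``` bash
# Hypothetical invocation: scan a single prompt on the GPU and only keep
# predictions at or above 90% confidence.
eclipse -p "reset the password for jane.doe@example.com" -g -c 0.9 -o results.txt

# Hypothetical invocation: scan a file line by line, using the default
# newline delimiter, and emit results incrementally.
eclipse -f ./chat_log.txt --line_by_line -o results.txt
```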
12 changes: 3 additions & 9 deletions src/eclipse/eclipse.py
@@ -422,7 +422,7 @@ def process_single_line(


def main():
-    parser = argparse.ArgumentParser(description="Entity recognition using BERT.")
+    parser = argparse.ArgumentParser(description="Sensitive Information Detector.")
parser.add_argument(
"-p", "--prompt", type=str, help="Direct text prompt for recognizing entities."
)
@@ -461,13 +461,7 @@ def main():
action="store_true",
help="Enable GPU usage for model inference.",
)
-    parser.add_argument(
-        "-dir",
-        "--model_directory",
-        type=str,
-        default=DEFAULT_MODEL_PATH,
-        help="Directory where the BERT model should be downloaded and unzipped.",
-    )

parser.add_argument(
"--line_by_line",
action="store_true",
@@ -490,7 +484,7 @@

# Now we ensure the model folder exists if needed
if args.prompt or args.file:
-        ensure_model_folder_exists(args.model_directory, auto_update=True)
+        ensure_model_folder_exists(args.model_path, auto_update=True)

# Determine whether to use the GPU or not based on the user's command line input
device = "cuda" if args.use_gpu and torch.cuda.is_available() else "cpu"
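Taken together, the eclipse.py changes drop the separate `--model_directory` flag (the model folder is now resolved from `args.model_path`) and introduce `-c/--confidence_threshold`. Below is a minimal sketch of how such a threshold could be parsed and applied; the default values, the prediction layout, and the `filter_predictions` helper are assumptions for illustration, not the project's actual implementation:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical, trimmed-down parser mirroring the flags touched by this commit."""
    parser = argparse.ArgumentParser(description="Sensitive Information Detector.")
    parser.add_argument(
        "-m", "--model_path", type=str, default="./eclipse_model",  # assumed default
        help="Path to the model; also used to ensure the model folder exists.",
    )
    parser.add_argument(
        "-c", "--confidence_threshold", type=float, default=0.8,  # assumed default
        help="Confidence threshold for considering predictions as high confidence.",
    )
    return parser


def filter_predictions(predictions, threshold):
    """Keep only predictions whose score meets the threshold.

    `predictions` is assumed to be a list of dicts with a float "score" key,
    similar to Hugging Face token-classification pipeline output.
    """
    return [p for p in predictions if p["score"] >= threshold]


if __name__ == "__main__":
    args = build_parser().parse_args(["-c", "0.9"])
    sample = [
        {"entity": "EMAIL", "word": "jane.doe@example.com", "score": 0.97},
        {"entity": "NAME", "word": "Jane", "score": 0.42},
    ]
    # Only the EMAIL prediction survives the 0.9 threshold.
    print(filter_predictions(sample, args.confidence_threshold))
```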
