Node ate obviously incorrect config #2276

Closed
cthulhu-rider opened this issue Mar 14, 2023 · 2 comments
Labels
bug (Something isn't working) · config (Configuration format update or breaking change) · I3 (Minimal impact) · neofs-storage (Storage node application issues) · S4 (Routine) · U3 (Regular)

Comments

@cthulhu-rider
Contributor

Context

The node silently started with this config (only the incorrect part is attached):

storage:
  shard:
    0:
      blobstor:
      - path: /srv/neofs/data0/blobovnicza
        type: blobovnicza
      - path: /srv/neofs/data0
        type: fstree
      metabase:
        path: /srv/neofs/meta/metabase0.db
      pilorama:
        path: /srv/neofs/meta/pilorama0.db
      writecache:
        path: /srv/neofs/meta/write_cache0
    default:
      blobstor:
        blobovnicza:
          depth: 1
          opened_cache_capacity: 32
          size: 4gb
          width: 8
        compress: true
        depth: 4
        perm: '0644'
        small_object_size: 150kb
      gc:
        remover_batch_size: 100
        remover_sleep_interval: 1m
      metabase:
        max_batch_size: 1000
        perm: '0644'
      pilorama:
        max_batch_delay: 10ms
        max_batch_size: 1000
        no_sync: true
      writecache:
        capacity: 64gb
        enabled: true
        max_object_size: 128mb
        memcache_capacity: 2gb
        small_object_size: 128kb
        workers_number: 30
    1:
      blobstor:
      - path: /srv/neofs/data1/blobovnicza
        type: blobovnicza
      - path: /srv/neofs/data1
        type: fstree
      metabase:
        path: /srv/neofs/meta/metabase1.db
      pilorama:
        path: /srv/neofs/meta/pilorama1.db
      writecache:
        path: /srv/neofs/meta/write_cache1
  shard_pool_size: 600
  shard_ro_error_threshold: 30

As we can see, the storage.shard.default.blobstor section is formed incorrectly according to the docs. Also, compress and small_object_size must be part of the shard config, not of blobstor.
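
For illustration only, here is the same default section rearranged into the shape used by shards 0 and 1 above and implied by the remark about compress / small_object_size: blobstor as a list of sub-storages with their type-specific options, and the two compression-related settings lifted to the shard level. This is a hypothetical sketch of the intended structure, not an authoritative schema:

storage:
  shard:
    default:
      compress: true
      small_object_size: 150kb
      blobstor:
      - type: blobovnicza
        depth: 1
        width: 8
        opened_cache_capacity: 32
        size: 4gb
      - type: fstree
        depth: 4
        perm: '0644'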

Expected behavior

The node fails to start.

Current behavior

Node starts silently.

Possible improvements

  • provide a config validator that shares code with the neofs-node app (maybe as one of its commands?) — a minimal sketch of the idea follows this list
  • provide a config defaulter that accepts the source config and outputs the to-be-used config (can be merged with the validator)
  • print the resulting configuration in the application log
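
One possible shape for such a validator, sketched here with plain gopkg.in/yaml.v3 rather than the configuration machinery neofs-node actually uses, is strict decoding that rejects unknown fields. The struct below covers only a hypothetical fragment of the shard-defaults schema, just enough to show the technique:

// validate_config.go: a minimal sketch of the proposed validator idea,
// not actual neofs-node code. It assumes gopkg.in/yaml.v3 and a
// hypothetical fragment of the shard-defaults schema.
package main

import (
	"fmt"
	"os"

	"gopkg.in/yaml.v3"
)

// shardDefaults declares only the fields we expect; anything else is unknown.
type shardDefaults struct {
	Compress        bool   `yaml:"compress"`
	SmallObjectSize string `yaml:"small_object_size"`
	Blobstor        []struct {
		Type string `yaml:"type"`
		Path string `yaml:"path"`
	} `yaml:"blobstor"`
}

func main() {
	f, err := os.Open(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	dec := yaml.NewDecoder(f)
	dec.KnownFields(true) // fail on fields the schema does not declare

	var cfg shardDefaults
	if err := dec.Decode(&cfg); err != nil {
		// e.g. "field blobovnicza not found in type main.shardDefaults"
		fmt.Fprintln(os.Stderr, "config validation failed:", err)
		os.Exit(1)
	}
	fmt.Println("config OK")
}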

Versions

  • neofs-node v0.35.0
@cthulhu-rider added the bug, triage, neofs-storage and config labels on Mar 14, 2023
@roman-khimov added the U3, S4 and I3 labels and removed the triage label on Dec 21, 2023
@roman-khimov
Member

@End-rey, can you try this config with the current node? This should be fixed with #2981.

@End-rey
Contributor

End-rey commented Nov 7, 2024

Errors like the following now appear when running with this config:

  • unknown field: storage.shard.default.blobstor.blobovnicza
  • unknown field: storage.shard.default.blobstor.compress
  • unknown field: storage.shard.default.writecache.memcache_capacity
  • unknown field: storage.shard.default.blobstor.small_object_size

So I think the issue can be closed.

@roman-khimov closed this as not planned on Nov 7, 2024