Dynamic changing of log level throws warnings. #386

Open
dforste opened this issue Feb 24, 2020 · 1 comment

Comments


dforste commented Feb 24, 2020

Please post all product and debugging questions on our forum. Your questions will reach our wider community members there, and if we confirm that there is a bug, then we can open a new issue here.

For all general issues, please provide the following details for fast resolution:

  • Version: logstash-input-beats 6.0.5 and 6.0.8, in Logstash 7.5.2
  • Operating System: Docker deployment
  • Config File (if you have sensitive info, please remove it):
input {
  beats {
    port => 5044
  }
}
  • Sample Data:
    Not relevant
  • Steps to Reproduce:
docker run -d --name logstash docker.elastic.co/logstash/logstash:7.5.2 
docker exec -it logstash curl -XPUT localhost:9600/_node/logging?pretty -H 'Content-Type: application/json' -d'
{
    "logger.logstash.inputs.beats" : "DEBUG"
}
'

The following warnings are emitted when you execute that:

/usr/share/logstash/logstash-core/lib/logstash/api/modules/logging.rb:14: warning: wrong element type NilClass at 0 (expected array)
/usr/share/logstash/logstash-core/lib/logstash/api/modules/logging.rb:14: warning: ignoring wrong elements is deprecated, remove them explicitly
/usr/share/logstash/logstash-core/lib/logstash/api/modules/logging.rb:14: warning: this causes ArgumentError in the next release

No additional logs are emitted after changing the level, though the new value does appear in the API.
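For context, these three messages match the warnings Ruby (pre-3.0) prints when `Hash[]` is given an array containing a non-pair element such as `nil`. A minimal sketch of that pattern, as a hypothetical reproduction only (this is not the actual code in logging.rb):

```ruby
# Hypothetical reproduction of the warning pattern seen in this report.
# Hash[array] warns in Ruby < 3.0 (and drops the bad element) or raises
# ArgumentError in Ruby >= 3.0 when the array contains a non-pair element.
pairs = [nil, ["logger.logstash.inputs.beats", "DEBUG"]]

begin
  # Ruby 2.x: prints "wrong element type NilClass at 0 (expected array)"
  # and the two follow-up deprecation lines to stderr, then silently
  # ignores the nil and still builds the hash from the valid pair.
  levels = Hash[pairs]
rescue ArgumentError => e
  # Ruby 3.x: the ArgumentError that the third warning predicted.
  levels = {}
  warn "Hash[] now raises: #{e.message}"
end
```

This would also be consistent with the observed behavior: the warnings go to stderr, yet the level change still takes effect, because older Rubies drop the invalid element and keep the valid pair.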


asolimando commented Dec 23, 2020

Exact same error after changing the logging level via the logging API:

curl -XPUT '$myhostport/_node/logging?pretty' -H 'Content-Type: application/json' -d'
{
    "logger.logstash.inputs.kafka" : "DEBUG"
}
'
{
  "host" : "$myhost",
  "version" : "7.1.1",
  "http_address" : "0.0.0.0:9600",
  "id" : "a168aa05-0e1b-490f-afc5-eaadebeb8500",
  "name" : "$myname",
  "acknowledged" : true
}

In the logs I have:

2020-12-23T08:07:57.953882997Z /usr/share/logstash/logstash-core/lib/logstash/api/modules/logging.rb:14: warning: wrong element type NilClass at 0 (expected array)
2020-12-23T08:07:57.954203734Z /usr/share/logstash/logstash-core/lib/logstash/api/modules/logging.rb:14: warning: ignoring wrong elements is deprecated, remove them explicitly
2020-12-23T08:07:57.954528334Z /usr/share/logstash/logstash-core/lib/logstash/api/modules/logging.rb:14: warning: this causes ArgumentError in the next release
2020-12-23T08:11:29.590014904Z /usr/share/logstash/logstash-core/lib/logstash/api/modules/logging.rb:14: warning: wrong element type NilClass at 0 (expected array)
2020-12-23T08:11:29.590241219Z /usr/share/logstash/logstash-core/lib/logstash/api/modules/logging.rb:14: warning: ignoring wrong elements is deprecated, remove them explicitly
2020-12-23T08:11:29.590331895Z /usr/share/logstash/logstash-core/lib/logstash/api/modules/logging.rb:14: warning: this causes ArgumentError in the next release

However, displaying the logging levels shows that everything looks fine (DEBUG is correctly set for the Kafka input plugin):

{
  "host" : "$myhost",
  "version" : "7.1.1",
  "http_address" : "0.0.0.0:9600",
  "id" : "a168aa05-0e1b-490f-afc5-eaadebeb8500",
  "name" : "$myname",
  "loggers" : {
    "logstash.agent" : "INFO",
    "logstash.api.service" : "INFO",
    "logstash.codecs.json" : "INFO",
    "logstash.config.source.local.configpathloader" : "INFO",
    "logstash.config.source.multilocal" : "INFO",
    "logstash.config.sourceloader" : "INFO",
    "logstash.filters.csv" : "INFO",
    "logstash.filters.date" : "INFO",
    "logstash.filters.drop" : "INFO",
    "logstash.filters.mutate" : "INFO",
    "logstash.filters.ruby" : "INFO",
    "logstash.inputs.kafka" : "DEBUG",
    "logstash.instrument.periodicpoller.deadletterqueue" : "INFO",
    "logstash.instrument.periodicpoller.jvm" : "INFO",
    "logstash.instrument.periodicpoller.os" : "INFO",
    "logstash.instrument.periodicpoller.persistentqueue" : "INFO",
    "logstash.javapipeline" : "INFO",
    "logstash.modules.scaffold" : "INFO",
    "logstash.outputs.kafka" : "INFO",
    "logstash.plugins.registry" : "INFO",
    "logstash.runner" : "INFO",
    "logstash.setting.writabledirectory" : "INFO",
    "org.apache.kafka.clients.ClientUtils" : "INFO",
    "org.apache.kafka.clients.CommonClientConfigs" : "INFO",
    "org.apache.kafka.clients.Metadata" : "INFO",
    "org.apache.kafka.clients.NetworkClient" : "INFO",
    "org.apache.kafka.clients.consumer.ConsumerConfig" : "INFO",
    "org.apache.kafka.clients.consumer.KafkaConsumer" : "INFO",
    "org.apache.kafka.clients.consumer.internals.AbstractCoordinator" : "INFO",
    "org.apache.kafka.clients.consumer.internals.AbstractPartitionAssignor" : "INFO",
    "org.apache.kafka.clients.consumer.internals.ConsumerCoordinator" : "INFO",
    "org.apache.kafka.clients.consumer.internals.ConsumerInterceptors" : "INFO",
    "org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient" : "INFO",
    "org.apache.kafka.clients.consumer.internals.Fetcher" : "INFO",
    "org.apache.kafka.clients.producer.KafkaProducer" : "INFO",
    "org.apache.kafka.clients.producer.ProducerConfig" : "INFO",
    "org.apache.kafka.clients.producer.internals.ProducerInterceptors" : "INFO",
    "org.apache.kafka.clients.producer.internals.RecordAccumulator" : "INFO",
    "org.apache.kafka.clients.producer.internals.Sender" : "INFO",
    "org.apache.kafka.common.metrics.JmxReporter" : "INFO",
    "org.apache.kafka.common.metrics.Metrics" : "INFO",
    "org.apache.kafka.common.network.NetworkReceive" : "INFO",
    "org.apache.kafka.common.network.PlaintextChannelBuilder" : "INFO",
    "org.apache.kafka.common.network.Selector" : "INFO",
    "org.apache.kafka.common.protocol.Errors" : "INFO",
    "org.apache.kafka.common.requests.DeleteAclsResponse" : "INFO",
    "org.apache.kafka.common.utils.AppInfoParser" : "INFO",
    "org.apache.kafka.common.utils.KafkaThread" : "INFO",
    "org.apache.kafka.common.utils.Utils" : "INFO",
    "org.logstash.Logstash" : "INFO",
    "org.logstash.config.ir.CompiledPipeline" : "INFO",
    "org.logstash.execution.AbstractPipelineExt" : "INFO",
    "org.logstash.execution.JavaBasePipelineExt" : "INFO",
    "org.logstash.execution.PeriodicFlush" : "INFO",
    "org.logstash.execution.ShutdownWatcherExt" : "INFO",
    "org.logstash.execution.WorkerLoop" : "INFO",
    "org.logstash.filters.DateFilter" : "INFO",
    "org.logstash.instrument.metrics.gauge.LazyDelegatingGauge" : "INFO",
    "org.logstash.plugins.pipeline.PipelineBus" : "INFO",
    "org.logstash.secret.store.SecretStoreFactory" : "INFO",
    "slowlog.logstash.codecs.json" : "INFO",
    "slowlog.logstash.filters.csv" : "INFO",
    "slowlog.logstash.filters.date" : "INFO",
    "slowlog.logstash.filters.drop" : "INFO",
    "slowlog.logstash.filters.mutate" : "INFO",
    "slowlog.logstash.filters.ruby" : "INFO",
    "slowlog.logstash.inputs.kafka" : "INFO",
    "slowlog.logstash.outputs.kafka" : "INFO"
  }
}

It does not recover even after resetting the log levels via:

curl -XPUT '$myhostport/_node/logging/reset?pretty'
