Validator Score #21

Open · wants to merge 16 commits into main
Conversation

@onmax (Member) commented May 28, 2024

Preview

TODOs
  • add internal links to learn sections like "inherents" or epochs...
  • discuss where to put the link in the sidebar
  • Rename the file from validator-trust-score to validator-score
  • Add young validators issue:
    How soon after a validator starts can we begin scoring them? What is the minimum time required to provide a reliable score?
    After it has completed one epoch. At that point, the liveness score will be close to 0, and it will quickly improve since the score is not linear. I will add more info about this specific case.

netlify bot commented May 28, 2024

Deploy Preview for nimiq-developer-center failed.

🔨 Latest commit: 17c9182
🔍 Latest deploy log: https://app.netlify.com/sites/nimiq-developer-center/deploys/66d03f602f13b60008701912

netlify bot commented May 28, 2024

Deploy Preview for developer-center failed.

🔨 Latest commit: 17c9182
🔍 Latest deploy log: https://app.netlify.com/sites/developer-center/deploys/66d03f607fbf9500086ca480

@onmax changed the title from [WIP] Validator Trust Score to [Validator Trust Score on May 31, 2024
@onmax changed the title from [Validator Trust Score to Validator Trust Score on May 31, 2024
@onmax marked this pull request as ready for review on May 31, 2024 08:45
$$
\text{window\_duration\_ms} = 9 \times 30 \times 24 \times 60 \times 60 \times 1000
$$
Member

Maybe explain what each number refers to?

Member Author

I need other opinions. People, please vote: Yay 👍 or Nay 👎?
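
For illustration, here is a minimal TypeScript sketch (the constant names are hypothetical, not from the PR) that spells out what each number in the formula above refers to: a 9-month window expressed as months × days × hours × minutes × seconds × milliseconds.

```ts
// Minimal sketch with hypothetical constant names: labels each factor
// in window_duration_ms.
const WINDOW_MONTHS = 9;        // scoring window length in months
const DAYS_PER_MONTH = 30;      // a month approximated as 30 days
const HOURS_PER_DAY = 24;
const MINUTES_PER_HOUR = 60;
const SECONDS_PER_MINUTE = 60;
const MS_PER_SECOND = 1000;

const windowDurationMs =
  WINDOW_MONTHS *
  DAYS_PER_MONTH *
  HOURS_PER_DAY *
  MINUTES_PER_HOUR *
  SECONDS_PER_MINUTE *
  MS_PER_SECOND;

console.log(windowDurationMs); // 23328000000 ms, i.e. roughly 9 months
```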

$$
C_i = \sum_{j=0}^{N-1} c_j \quad \text{for } i = 0, 1, 2, \ldots, m-1
$$

$c_j$ is the number of blocks that the validator produced in batch $j$, where $j \in [0, N-1]$.
Member

We could emphasize that it is not only blocks produced, but blocks correctly produced or rewarded for.

Member Author

$C_i$ is the number of blocks the validator was rewarded for (and thus produced) in epoch $i$:

Let me know if you like it.
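
As an illustration of the formula, here is a minimal TypeScript sketch (the function name and data shape are hypothetical, not from the PR) that sums the rewarded block counts $c_j$ over the $N$ batches of a single epoch to obtain $C_i$:

```ts
// Hypothetical data shape: batches[j] = c_j, the number of blocks the validator
// was rewarded for (and thus produced) in batch j of epoch i.
function rewardedBlocksInEpoch(batches: number[]): number {
  // C_i = sum of c_j for j in [0, N-1]
  return batches.reduce((sum, c) => sum + c, 0);
}

// Example: an epoch with N = 4 batches
console.log(rewardedBlocksInEpoch([3, 0, 5, 2])); // 10
```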


If a validator is not actively producing blocks, it could still have high Size and Reliability scores. This would be misleading because it is not contributing to the operation of the network. The Liveness factor ensures that only active validators that are actually selected to produce blocks receive a higher score. We want to penalize validators that are not selected to produce blocks due to being inactive, jailed, offline, etc.

We use the term _liveness_ instead of _uptime_ because _uptime_ suggests precision, as in server contexts where the exact online time can be measured. In our context, there is no way to measure how long a validator has been online. We can only know when the validators have been active and producing blocks, but there is no way of knowing when they are active but not producing blocks, or when they are offline. In summary, to avoid confusion, we use _liveness_ to represent how often a validator is actively selected to produce blocks.
Member

Suggested change
We use the term _liveness_ instead of _uptime_ because _uptime_ suggests precision, as in server contexts where the exact online time can be measured. In our context, there is no way to measure how long a validator has been online. We can only know when the validators have been active and producing blocks, but there is no way of knowing when they are active but not producing blocks, or when they are offline. In summary, to avoid confusion, we use _liveness_ to represent how often a validator is actively selected to produce blocks.
We use the term _liveness_ instead of _uptime_ because _uptime_ suggests precision, as in server contexts where the exact online time can be measured. In our context, there is no way to measure how long a validator has been online. We can only know when the validators have been active and producing blocks, but there is no way of knowing if they are active and not producing blocks or offline. In summary, to avoid confusion, we use _liveness_ to represent how often a validator is actively selected to produce blocks.

I want to make clear that a validator can be active and offline at the same time.
A validator can be offline and not produce blocks because it is offline, or active but not producing blocks because it was not selected in a specific epoch.

Member Author

What about this?


We use the term liveness instead of uptime because uptime implies precise measurement, as in server contexts where you can measure online time. In our case, we can't measure how long a validator has been online. We can only see when validators are active and producing blocks. There's no way of telling when they're active but not producing blocks, or when they're offline.

To be clear, a validator can be active and offline at the same time. It could be offline and not producing blocks because it's offline, or it could be active but not producing blocks because it hasn't been selected in a certain period. This is why we use liveness to show how often a validator is selected to produce blocks.
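
To make the intent concrete, here is a hypothetical TypeScript sketch of a liveness-style measure. The actual formula is not defined in this excerpt; this only illustrates the idea of counting how often a validator is selected to produce blocks across epochs.

```ts
// Hypothetical illustration only: the real liveness score is not given here.
// This simply measures the fraction of epochs in which the validator was
// selected to produce blocks.
interface EpochActivity {
  selected: boolean; // true if the validator was selected (not inactive, jailed or offline)
}

function livenessRatio(epochs: EpochActivity[]): number {
  if (epochs.length === 0) return 0;
  const selectedCount = epochs.filter((e) => e.selected).length;
  return selectedCount / epochs.length;
}

// Example: selected in 3 of 4 epochs
console.log(
  livenessRatio([
    { selected: true },
    { selected: true },
    { selected: false },
    { selected: true },
  ]),
); // 0.75
```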

@nibhar (Member) commented May 31, 2024

As a higher-level comment, after an initial quick read it seems to me that Size and Liveness, as they are currently calculated, are inversely proportional to one another. That is probably not ideal, as they sort of measure the same thing (a bigger size gets you selected more often), but the score adjustments oppose each other.

An option would be to change Liveness from being based purely on being selected to being based on not being selected for a reason other than randomness (being jailed, being deactivated, etc.). That is also implied by the paragraph leading up to it. This too must be done carefully, as Reliability is already a factor and we likely do not want to measure that again either.

Another option would be to scrap Liveness if it does not add anything new and its function can be absorbed into either Reliability or Size.

also removed "trust"; the name is now just "score"
@Eligioo (Member) commented Aug 29, 2024

Validator Trust Score has been renamed to Validator Score?

@onmax (Member, Author) commented Aug 30, 2024

> Validator Trust Score has been renamed to Validator Score?

After a few conversations with Soeren, I thought it might not be such a good idea to add the adjective "trust". For now this is not a final decision; I will bring up this discussion in the coming days, after Alberto has checked and verified that the code and the latest commit in this PR are OK.

@onmax changed the title from Validator Trust Score to Validator Score on Sep 18, 2024