writing HI configuration file 2024 #4988

Open · LinaresToine wants to merge 12 commits into master from HITesting

Conversation

@LinaresToine (Contributor)

HI data taking is coming up, and with this PR I attempt to create the HIReplayConfiguration.py file.

Questions remaining:

  • Global tags
  • Scenarios
  • HLTScout testing configuration
  • PDs configuration
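
For orientation, here is a minimal, hedged sketch of the kind of settings these questions refer to. The values are placeholders taken from snippets quoted later in this thread, and addDataset() is a hypothetical stand-in for the real T0 configuration helper, not its actual API.

```python
# Illustrative sketch only: the global tags, scenario string, and PD options
# below are placeholders based on snippets quoted in this PR, not the final
# 2024 values. addDataset() is a stand-in for the real T0 config helper.

def addDataset(datasets, name, **options):
    # Hypothetical helper: simply records the per-PD options.
    datasets[name] = options

# Global tags (the 141X GTs were still pending when this PR was opened)
expressGlobalTag = "141X_dataRun3_Express_v2"
promptrecoGlobalTag = "141X_dataRun3_Prompt_v2"

# One of the scenario strings discussed below
hiScenario = "ppEra_Run3_pp_on_PbPb_2024"

datasets = {}
# Example PD entry using parameters that appear in the diffs below
# (the PD name "HIPhysicsRawPrime0" is hypothetical).
addDataset(datasets, "HIPhysicsRawPrime0",
           do_reco=True,
           write_dqm=True,
           alca_producers=["TkAlMinBias"],
           dqm_sequences=["@common"],
           scenario=hiScenario)
```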

@germanfgv (Contributor)

It seems you are replacing the HI replay config with the regular replay config. Please review the changes, as many parameters differ between the HI and pp configs.

@LinaresToine (Contributor, Author)

Yes, I originally created the file based on the current replay config. I have now based it on last year's HI config.

I believe the file is ready to run a replay; the only thing we are missing for now is the 141X GTs.

-expressGlobalTag = "132X_dataRun3_Express_v4"
-promptrecoGlobalTag = "132X_dataRun3_Prompt_v4"
+expressGlobalTag = "141X_dataRun3_Express_v2"
+promptrecoGlobalTag = "141X_dataRun3_Prompt_v2"
Reviewer (Contributor):

Suggested change:
-promptrecoGlobalTag = "141X_dataRun3_Prompt_v2"
+promptrecoGlobalTag = "141X_dataRun3_Prompt_v3"

141X_dataRun3_Prompt_v3 has DeDxCalibrationRcd, see cms-sw/cmssw#46186

@@ -1337,7 +1373,7 @@
 write_dqm=True,
 alca_producers=["TkAlMinBias"],
 dqm_sequences=["@common"],
-scenario=hiTestppScenario)
+scenario=hiLightEventsScenario)
@mmusich (Contributor):

@LinaresToine I don't understand this choice (and also several others).
In 2023 this PD was processed with ppEra_Run3_pp_on_PbPb_2023. I think this should be hiHeavyEventsScenario (= "ppEra_Run3_pp_on_PbPb_2024") now.
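
A hedged sketch of where this suggestion points, assuming hiHeavyEventsScenario is (or will be) defined in the config as the 2024 heavy-events era:

```python
# Assumed definition, mirroring the reviewer's parenthetical note; not a verified patch.
hiHeavyEventsScenario = "ppEra_Run3_pp_on_PbPb_2024"

# In the dataset entry from the diff above, the suggestion amounts to replacing
#   scenario=hiLightEventsScenario)
# with
#   scenario=hiHeavyEventsScenario)
```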

@@ -1346,7 +1382,7 @@
 do_reco=False,
 aod_to_disk=False,
 raw_to_disk=False,
-scenario=hiTestppScenario)
+scenario=hiLightEventsScenario)
@mmusich (Contributor):

@LinaresToine I don't understand this choice (and also several others).
In 2023 this PD was processed with ppEra_Run3_pp_on_PbPb_2023. I think this should be hiHeavyEventsScenario (= "ppEra_Run3_pp_on_PbPb_2024") now.

@@ -1357,7 +1393,7 @@
 raw_to_disk=False,
 aod_to_disk=False,
 dqm_sequences=["@common"],
-scenario=hiTestppScenario)
+scenario=hiLightEventsScenario)
@mmusich (Contributor):

@LinaresToine I don't understand this choice (and also several others).
In 2023 this PD was processed with ppEra_Run3_pp_on_PbPb_2023. I think this should be hiHeavyEventsScenario (= "ppEra_Run3_pp_on_PbPb_2024") now.

@@ -1433,7 +1469,7 @@
 timePerEvent=1,
 disk_node="T2_US_Vanderbilt",
 dqm_sequences=["@commonSiStripZeroBias"],
-scenario=hiTestppScenario)
+scenario=hiLightEventsScenario)
@mmusich (Contributor):

@LinaresToine I don't understand this choice (and also several others).
In 2023 this PD was processed with ppEra_Run3_pp_on_PbPb_2023. I think this should be hiHeavyEventsScenario (= "ppEra_Run3_pp_on_PbPb_2024") now.

@LinaresToine (Contributor, Author)

Thank you @mmusich. I originally understood that RawPrime goes with ppEra_Run3_pp_on_PbPb_approxSiStripClusters_2024, while everything else goes with UPC, except for express, which uses ppEra_Run3_pp_on_PbPb_2024. I'll retry with:

RawPrime: ppEra_Run3_pp_on_PbPb_approxSiStripClusters_2024 --> hiRawPrimeScenario
Forward: ppEra_Run3_2024_UPC --> hiForwardScenario
Everything else: ppEra_Run3_pp_on_PbPb_2024 --> hiScenario
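
As a sketch, assuming the scenario variable names stay as written in this comment (the final config may use different names), the mapping would look like:

```python
# Scenario mapping proposed above; names and era strings copied from this comment.
hiRawPrimeScenario = "ppEra_Run3_pp_on_PbPb_approxSiStripClusters_2024"  # RawPrime PDs
hiForwardScenario = "ppEra_Run3_2024_UPC"                                # Forward (UPC) PDs
hiScenario = "ppEra_Run3_pp_on_PbPb_2024"                                # everything else, including Express
```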

@fabiocos (Contributor)

fabiocos commented Oct 8, 2024

@LinaresToine we should try a new replay test of this configuration using CMSSW_14_1_1, which should contain all the latest CMSSW updates discussed as a follow-up to the previous failure, in preparation for the forthcoming HI data taking (regular pp data taking ends one week from now).

@LinaresToine (Contributor, Author)

LinaresToine commented Oct 8, 2024

@fabiocos the replay with CMSSW_14_1_1 has ended; I did not update the PR, apologies. We have an error with the DeDxEstimator module. I will create a GH issue for this.

@LinaresToine (Contributor, Author)

For the record, the previous issues with cmssw were solved in this issue:

@fabiocos (Contributor)

https://github.com/cms-sw/cmssw/releases/tag/CMSSW_14_1_2 is now available with the known needed fixes. I suggest updating the branch above and restarting a replay before the weekend, so that at the Joint Ops meeting on Monday we have a clear view of the status in preparation for HI data taking.

@germanfgv (Contributor)

> https://github.com/cms-sw/cmssw/releases/tag/CMSSW_14_1_2 is now available with the known needed fixes. I suggest updating the branch above and restarting a replay before the weekend, so that at the Joint Ops meeting on Monday we have a clear view of the status in preparation for HI data taking.

We will trigger the replay again, now with this new version. It should be done by Monday.

@LinaresToine (Contributor, Author)

The replay has started; it can be followed here:

CMSSW_14_1_2 Replay

@fabiocos (Contributor)

The replay has finished successfully, according to both the Grafana dashboard and the testbed WMStats instance.

@LinaresToine (Contributor, Author)

LinaresToine commented Oct 12, 2024

Output data for the replay can be found in DAS.

LinaresToine mentioned this pull request on Oct 15, 2024.
LinaresToine force-pushed the HITesting branch 3 times, most recently from 9cc4001 to b2c02e3 on October 17, 2024 at 19:36.