I've started OM version 1.5.5 on Docker, following this deployment guide: https://docs.open-metadata.org/latest/deployment/docker
However, any time I try to add a data profiler or a quality test case, it shows the errors:
"Authentication failed for user [[email protected]] trying to access the Airflow APIs." and "Failed to find OpenMetadata - Managed Airflow APIs ..."
as in the screenshot below.
I've configured my own Airflow host in the environment variable
PIPELINE_SERVICE_CLIENT_ENDPOINT="..."
Am I missing anything? Does OM support an external Airflow host, or must it use the ingestion Docker image provided by OM in order to run QA and profiling actions from OpenMetadata's UI?
I've added two more screenshots below.
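For context, this is roughly how my Docker environment is wired up. Only PIPELINE_SERVICE_CLIENT_ENDPOINT is a value I set explicitly; the credential variable names are my reading of the standard OM compose files, so treat them as an assumption:

```
# Sketch of the relevant env vars (credential variable names assumed
# from the default OpenMetadata docker compose, not verified here)
PIPELINE_SERVICE_CLIENT_ENDPOINT="https://my-airflow-host"  # external Airflow
AIRFLOW_USERNAME="admin"   # user OM authenticates with against the Airflow APIs
AIRFLOW_PASSWORD="admin"
```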
It works perfectly with an external Airflow instance; we do it with no issues.
However, it seems that the credentials you are using are not allowed to access your Airflow API or database. How is your authentication set up? Can you access the Airflow API using curl?
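A quick check along these lines (host, user, and password are placeholders; the endpoint path is the plugin's status API used later in this thread) would confirm whether basic auth works from outside OpenMetadata:

```
# Placeholder host and credentials; adjust to your setup
curl -u "admin:admin" \
  "https://<airflow-host>/api/v1/openmetadata/status?dag_id=<some_dag_id>"
```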
I'm trying to connect to MWAA, and I installed the openmetadata managed APIs plugin on it successfully. The API can be reached with curl or with requests in Python,
but it needs to obtain a session cookie before calling the APIs, like the following code:
import logging

import boto3
import requests


def get_session_info(region, env_name):
    logging.basicConfig(level=logging.INFO)
    try:
        # Initialize MWAA client and request a web login token
        mwaa = boto3.client("mwaa", region_name=region)
        response = mwaa.create_web_login_token(Name=env_name)
        # Extract the web server hostname and login token
        web_server_host_name = response["WebServerHostname"]
        web_token = response["WebToken"]
        # Construct the URL needed for authentication
        login_url = f"https://{web_server_host_name}/aws_mwaa/login"
        login_payload = {"token": web_token}
        # Make a POST request to the MWAA login URL using the login payload
        response = requests.post(
            login_url,
            data=login_payload,
            timeout=10,
        )
        # Check if login was successful
        if response.status_code == 200:
            # Return the hostname and the session cookie
            return (
                web_server_host_name,
                response.cookies["session"],
            )
        # Log an error
        logging.error("Failed to log in: HTTP %d", response.status_code)
        return None
    except requests.RequestException as e:
        # Log any exceptions raised during the request to the MWAA login endpoint
        logging.error("Request failed: %s", str(e))
        return None
    except Exception as e:
        # Log any other unexpected exceptions
        logging.error("An unexpected error occurred: %s", str(e))
        return None
def requestapi(region, env_name, dag_name):
    """
    Queries the status of a DAG in a specified MWAA environment through the
    openmetadata managed APIs plugin's REST endpoint.
    Args:
        region (str): AWS region where the MWAA environment is hosted.
        env_name (str): Name of the MWAA environment.
        dag_name (str): Name of the DAG to query.
    """
    logging.info(f"Attempting to query DAG {dag_name} in environment {env_name} in region {region}")
    # Retrieve the web server hostname and session cookie for authentication
    try:
        web_server_host_name, session_cookie = get_session_info(region, env_name)
        if not session_cookie:
            logging.error("Authentication failed, no session cookie retrieved.")
            return
    except Exception as e:
        logging.error(f"Error retrieving session info: {str(e)}")
        return
    # Prepare the session cookie and (for the commented-out trigger call) the payload
    cookies = {"session": session_cookie}
    json_body = {"dag_id": dag_name}
    url = f"https://{web_server_host_name}/api/v1/openmetadata/status?dag_id={dag_name}"
    # Send the GET request to the plugin's status endpoint
    try:
        response = requests.get(url, cookies=cookies, timeout=10)
        # response = requests.post(url, cookies=cookies, json=json_body)
        # Check the response status code to determine if the request succeeded
        print(response.status_code)
        if response.status_code == 200:
            print(f"DAG status retrieved successfully: {response.json()}")
            return response
        logging.error(f"Failed to query DAG: HTTP {response.status_code} - {response.text}")
        return response
    except requests.RequestException as e:
        logging.error(f"Request to query DAG failed: {str(e)}")
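For anyone comparing against their own setup, the endpoint the code above hits follows this pattern (build_status_url is a hypothetical helper I wrote for illustration, not part of the plugin):

```python
# Hypothetical helper mirroring the URL built inside requestapi() above
def build_status_url(web_server_host_name: str, dag_id: str) -> str:
    """Return the openmetadata managed APIs status URL for a DAG."""
    return f"https://{web_server_host_name}/api/v1/openmetadata/status?dag_id={dag_id}"


print(build_status_url("example.mwaa.amazonaws.com", "mysql_omdb_ingestion"))
# → https://example.mwaa.amazonaws.com/api/v1/openmetadata/status?dag_id=mysql_omdb_ingestion
```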