Update fix/clean up #4292
Conversation
Updates from airqo staging
📝 Walkthrough
The pull request introduces refinements to exception handling, method signatures, and Airflow DAG configurations across multiple files. The changes focus on improving error logging, input validation, and task execution robustness in the AirQo ETL (Extract, Transform, Load) workflows.
Changes
Actionable comments posted: 1
🧹 Nitpick comments (2)
src/workflows/airqo_etl_utils/airqo_utils.py (2)
512-512: Address the TODO comment regarding raw data quality checks.
The comment indicates a need for a more structured approach to raw data quality checks.
Would you like me to help design a more robust implementation for the raw data quality checks?
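One possible "more structured" approach is to register each category's check in a dispatch table instead of branching inline. This is only a hypothetical sketch: `DeviceCategory`, the check functions, and `run_raw_data_check` are stand-in names, not the project's actual API.

```python
# Hypothetical sketch of a dispatch-table approach to raw data quality checks.
# All names here (DeviceCategory, the check functions) are illustrative stand-ins.
import logging
from enum import Enum, auto
from typing import Callable, Dict

logger = logging.getLogger(__name__)


class DeviceCategory(Enum):
    LOW_COST = auto()
    LOW_COST_GAS = auto()


def pm2_5_check(data) -> None:
    ...  # placeholder for the PM2.5 low-cost sensor expectations


def gaseous_check(data) -> None:
    ...  # placeholder for the gaseous low-cost sensor expectations


# Adding a new category's check becomes a one-line registration.
RAW_DATA_CHECKS: Dict[DeviceCategory, Callable] = {
    DeviceCategory.LOW_COST: pm2_5_check,
    DeviceCategory.LOW_COST_GAS: gaseous_check,
}


def run_raw_data_check(device_category: DeviceCategory, data) -> bool:
    """Run the registered check; log and continue on unknown categories or failures."""
    check = RAW_DATA_CHECKS.get(device_category)
    if check is None:
        logger.warning("No raw data quality check registered for %s", device_category)
        return False
    try:
        check(data)
        return True
    except Exception as e:
        logger.warning("Raw data quality check failed for %s: %s", device_category, e)
        return False
```

A table like this also makes the "unexpected category" case explicit by construction, rather than relying on a wildcard match arm.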
513-520: Consider adding error handling to the pattern matching block.
While the pattern matching implementation is clean, it would benefit from error handling to gracefully handle unexpected device categories.

```diff
 match device_category:
     case DeviceCategory.LOW_COST_GAS:
-        AirQoGxExpectations.from_pandas().gaseous_low_cost_sensor_raw_data_check(
-            data
-        )
+        try:
+            AirQoGxExpectations.from_pandas().gaseous_low_cost_sensor_raw_data_check(
+                data
+            )
+        except Exception as e:
+            logger.warning(f"Raw data quality check failed for LOW_COST_GAS: {e}")
     case DeviceCategory.LOW_COST:
-        AirQoGxExpectations.from_pandas().pm2_5_low_cost_sensor_raw_data(data)
+        try:
+            AirQoGxExpectations.from_pandas().pm2_5_low_cost_sensor_raw_data(data)
+        except Exception as e:
+            logger.warning(f"Raw data quality check failed for LOW_COST: {e}")
+    case _:
+        logger.warning(f"Unexpected device category: {device_category}")
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/workflows/airqo_etl_utils/airqo_utils.py (1 hunks)
🧰 Additional context used
🪛 Ruff (0.8.2)
src/workflows/airqo_etl_utils/airqo_utils.py
525-525: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)
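The B904 finding is about exception chaining (PEP 3134). As an illustrative sketch, separate from the project's code: re-raising with `from e` stores the original exception on the new one's `__cause__`, so the traceback shows both errors.

```python
# Illustrative sketch only (not the project's code): PEP 3134 exception
# chaining, which is what Ruff's B904 rule asks for.

def parse_port(raw: str) -> int:
    """Parse a TCP port, wrapping parse failures in a RuntimeError."""
    try:
        return int(raw)
    except ValueError as e:
        # "from e" makes the ValueError the explicit cause, so the traceback
        # reads "The above exception was the direct cause of the following ..."
        raise RuntimeError(f"invalid port value: {raw!r}") from e


try:
    parse_port("not-a-number")
except RuntimeError as err:
    print(type(err.__cause__).__name__)  # → ValueError
```

By contrast, a bare `raise` inside the `except` block would chain implicitly via `__context__`, and `from None` would suppress the chain entirely.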
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: Analyze (python)
- GitHub Check: Analyze (javascript)
🔇 Additional comments (1)
src/workflows/airqo_etl_utils/airqo_utils.py (1)
488-507: Well-documented function with clear steps!
The expanded docstring provides excellent clarity about the function's purpose and process. The step-by-step breakdown and proper type hints make it very maintainable.
```python
try:
    data.dropna(subset=["timestamp"], inplace=True)
    data["timestamp"] = pd.to_datetime(data["timestamp"])
    data.dropna(subset=["timestamp"], inplace=True)
except Exception as e:
    logger.exception(f"There is an issue with the timestamp column: {e}")
    raise KeyError(f"An error has occurred with the 'timestamp' column: {e}")
```
🛠️ Refactor suggestion
Improve exception handling by using `raise ... from`.
The current exception handling could be improved to better preserve the exception chain.

```diff
 try:
     data.dropna(subset=["timestamp"], inplace=True)
     data["timestamp"] = pd.to_datetime(data["timestamp"])
 except Exception as e:
     logger.exception(f"There is an issue with the timestamp column: {e}")
-    raise KeyError(f"An error has occurred with the 'timestamp' column: {e}")
+    raise KeyError(f"An error has occurred with the 'timestamp' column: {e}") from e
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
try:
    data.dropna(subset=["timestamp"], inplace=True)
    data["timestamp"] = pd.to_datetime(data["timestamp"])
except Exception as e:
    logger.exception(f"There is an issue with the timestamp column: {e}")
    raise KeyError(f"An error has occurred with the 'timestamp' column: {e}") from e
```
Description
Enable task retries for weather data.
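In Airflow, task retries are typically enabled through `retries` and `retry_delay` in a task's or DAG's `default_args`. The sketch below shows the usual shape of that configuration; the owner value and DAG name in the comment are hypothetical, and this PR's actual change may differ. Only stdlib `timedelta` is used, so the snippet runs without Airflow installed.

```python
# Sketch of how Airflow task retries are commonly configured via default_args.
# The owner and DAG name are hypothetical; the keys themselves (retries,
# retry_delay, retry_exponential_backoff) are standard Airflow task parameters.
from datetime import timedelta

default_args = {
    "owner": "airqo",                     # hypothetical owner
    "retries": 3,                         # re-run a failed task up to 3 times
    "retry_delay": timedelta(minutes=5),  # wait between attempts
    "retry_exponential_backoff": True,    # back off 5m, 10m, 20m, ...
}

# In a real DAG file this dict would be passed to the DAG, e.g.:
#   with DAG("weather-data-dag", default_args=default_args, ...):
#       ...
```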
Summary by CodeRabbit
Release Notes
Bug Fixes
Refactor
Documentation
The changes focus on improving code reliability, error handling, and workflow consistency across various data processing components.