Data Analyst (Technical)
Full-Time / Egypt - Remote
About the Role:
Our client is looking for a technically strong Data Analyst to help transform spreadsheet-based reporting into scalable, automated data workflows. This role is ideal for someone who can combine analytical thinking with Python development to streamline data processes and support business insights.
Key Responsibilities:
Convert Google Sheets logic and formulas into clean, documented Python scripts.
Manipulate, analyze, and merge datasets using CSV files and data libraries.
Develop automated workflows and pipelines using tools like Pandas or PySpark.
Collaborate with data and product teams to ensure accuracy and scalability of data outputs.
Generate reports in CSV format to match defined outputs.
Optionally, perform exploratory analysis and build visualizations using Seaborn or similar tools.
Maintain a GitHub repository with well-documented scripts, output files, and optional insights.
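To make the first few responsibilities concrete, here is a minimal, hypothetical sketch of replacing Google Sheets lookup-and-sum logic with a documented pandas script. The file contents, column names, and output filename are illustrative, not taken from any real dataset:

```python
import io

import pandas as pd

# Hypothetical inputs: in the real workflow these would be CSV files
# exported from Google Sheets (column names are illustrative).
orders_csv = io.StringIO(
    "order_id,customer_id,amount\n"
    "1001,C01,250\n"
    "1002,C02,125\n"
    "1003,C01,80\n"
)
customers_csv = io.StringIO(
    "customer_id,region\n"
    "C01,Cairo\n"
    "C02,Alexandria\n"
)

orders = pd.read_csv(orders_csv)
customers = pd.read_csv(customers_csv)

# Equivalent of a Sheets VLOOKUP on customer_id: a left merge keeps
# every order row and attaches the matching customer's region.
report = orders.merge(customers, on="customer_id", how="left")

# Equivalent of a per-region SUMIF, written as a groupby aggregation.
summary = report.groupby("region", as_index=False)["amount"].sum()

# Emit the report as CSV to match a defined output format.
summary.to_csv("regional_totals.csv", index=False)
print(summary)
```

The same script, versioned in the GitHub repository with a short README, doubles as the documentation of the original spreadsheet logic.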
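The optional exploratory-analysis responsibility might look like the following hypothetical sketch: aggregate a small dataset with pandas, then chart it with Seaborn. The data values and output filename are invented for illustration:

```python
import matplotlib

matplotlib.use("Agg")  # non-interactive backend so the script runs headless

import pandas as pd
import seaborn as sns

# Hypothetical dataset: the kind of merged CSV output described above.
df = pd.DataFrame({
    "region": ["Cairo", "Cairo", "Alexandria", "Giza"],
    "amount": [250, 80, 125, 60],
})

# Aggregate first, then plot: total amount per region as a bar chart.
totals = df.groupby("region", as_index=False)["amount"].sum()
ax = sns.barplot(data=totals, x="region", y="amount")
ax.set_title("Total amount by region")
ax.figure.savefig("amount_by_region.png")
```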
Requirements:
Proficiency in Python and libraries such as Pandas (or PySpark).
Solid experience working with CSV data and spreadsheet logic.
Ability to write clean, reproducible code and document the process clearly.
Understanding of version control and working with Git/GitHub.
Strong analytical skills and attention to detail.
Ability to work independently and translate business needs into data workflows.
Preferred Qualifications:
Experience with Airflow, Databricks, or similar workflow/orchestration tools.
Familiarity with data visualization libraries such as Seaborn, Matplotlib, or Plotly.
Knowledge of ETL/ELT best practices.
Background in a technical or data-focused role (Analytics, Engineering, etc.).
Bachelor’s degree in Data Science, Computer Science, or related field.
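To make the ETL expectation above concrete, here is a minimal, hypothetical sketch of the extract-transform-load shape such a pipeline typically takes, using only Python's standard csv module. Field names and the validation rule are illustrative; in production the three steps would usually run as tasks under an orchestrator such as Airflow:

```python
import csv
import io


def extract(source) -> list[dict]:
    """Extract: read raw rows from a CSV source."""
    return list(csv.DictReader(source))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize types and drop rows that fail validation."""
    clean = []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])
        except (KeyError, ValueError):
            continue  # skip malformed rows rather than corrupt the output
        clean.append(row)
    return clean


def load(rows: list[dict], sink) -> None:
    """Load: write the cleaned rows to the destination CSV."""
    writer = csv.DictWriter(sink, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)


# Illustrative run against in-memory CSV data; one malformed row is dropped.
raw = io.StringIO("order_id,amount\n1,19.5\n2,not-a-number\n3,7\n")
out = io.StringIO()
load(transform(extract(raw)), out)
print(out.getvalue())
```

Keeping each stage a separate, pure function is what makes the pipeline testable and easy to hand to an orchestration tool later.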