The Company
This client is focused on creating a sustainable future for the next generation. They are the 4th-largest utility-scale operations and maintenance provider in the U.S. and operate one of the largest solar portfolios in the country.
They are looking to bring on a remote Data Engineer (EST hours) with a keen eye for analyzing both business and product data. The ideal candidate excels on a cross-functional team and is passionate about improving the business and delivering excellent customer service.
Responsibilities
- Use Python (PySpark, pandas, NumPy, etc.) for data wrangling and ETL to cloud platforms (AWS; Snowflake preferred)
- Build libraries and integrations with SaaS APIs
- Design and implement big data architecture, data pipelines, dashboards, and quality controls
- Design and maintain ETL pipelines for data warehouses
- Build automated scripts using Python
- Identify data bottlenecks and build solutions that optimize queries and improve data visualization
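For a flavor of the day-to-day data wrangling described above, here is a minimal pandas ETL sketch; the dataset, column names, and transformation are hypothetical, not taken from the client's actual systems:

```python
import pandas as pd

def transform_readings(df: pd.DataFrame) -> pd.DataFrame:
    """Clean raw solar-site readings and aggregate daily output (illustrative)."""
    df = df.dropna(subset=["site_id", "kwh"])  # drop incomplete rows
    df["date"] = pd.to_datetime(df["timestamp"]).dt.date
    # sum energy per site per day
    daily = (
        df.groupby(["site_id", "date"], as_index=False)["kwh"]
          .sum()
          .rename(columns={"kwh": "daily_kwh"})
    )
    return daily

raw = pd.DataFrame({
    "site_id": ["A", "A", "B", None],
    "timestamp": ["2024-06-01 09:00", "2024-06-01 15:00",
                  "2024-06-01 12:00", "2024-06-01 13:00"],
    "kwh": [5.0, 7.5, 4.2, 1.0],
})
daily = transform_readings(raw)
# site A aggregates to 12.5 kWh; the row with a null site_id is dropped
```

In a production pipeline this kind of transform would typically be ported to PySpark for scale and loaded into a warehouse such as Snowflake.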
Minimum Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- Extensive Python experience
- Understanding of query logic and proficiency in SQL
Skills/Abilities
- Strong Python programming skills with an emphasis on product development.
- Experience with data visualization application development using Python.
- Experience with SQL, Python, Spark, Power BI, ETL, and Tableau.
- Experience developing at scale on cloud platforms such as Google Cloud Platform (preferred), AWS, Azure, or Snowflake.
- Experience designing data engineering solutions with open-source and proprietary cloud data pipeline tools such as Airflow, Glue, and Dataflow (preferred).
- Excellent written and verbal communication skills for coordinating across teams.