Hiring: Senior Data Engineer

First Source
Tehran


Technologies

    Python, GCP, SQL, PostgreSQL

The team

We are the Business Intelligence team responsible for cultivating a data-driven culture. We manage and use data and analytics to build better products and services based on a deep understanding of our consumers. We play a huge role in driving intelligent marketing decisions, optimizing our client's business, and increasing profitability.

The role

As a Senior Data Engineer at First Source, you will focus on shaping our future data models. You will provide business intelligence solutions by leveraging technologies such as Google Cloud Platform, Airflow, Python, Docker, and PostgreSQL. Primary responsibilities will include developing, testing, and maintaining architectures for data processing and building Extract, Transform, and Load (ETL) pipelines. We will rely on your experience and skills to ensure data accuracy and improve the functionality of our data warehouse.
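To make the stack concrete, below is a minimal sketch of the kind of daily ETL DAG this role would build and maintain, assuming a recent Airflow 2.x installation; the DAG name, schedule, and task bodies are illustrative placeholders rather than an existing First Source pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Hypothetical: pull the previous day's rows from the in-house PostgreSQL source.
    print(f"extracting rows for {context['ds']}")


def transform(**context):
    # Hypothetical: clean and conform the extracted rows to the warehouse model.
    print("transforming extracted rows")


def load(**context):
    # Hypothetical: append the conformed rows to a BigQuery fact table.
    print("loading rows into the warehouse")


with DAG(
    dag_id="orders_daily_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Extract -> transform -> load, once per day.
    extract_task >> transform_task >> load_task
```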


Challenges

● Steer a wide range of data engineering responsibilities, such as maintaining data integrity, accuracy, and security, and storing data systematically for easy access.
● Extract data from in-house and third-party systems and other complex sources.
● Adhere to best practices while building a secure, scalable company data warehouse and pipelines to support data science projects.
● Improve the efficiency of the ETL pipeline by resolving logging errors and recommending improvements.
● Utilise your experience and data engineering knowledge to develop business solutions.
● Keep up-to-date on company products and new releases to efficiently plan changes on data pipelines or warehouses.

Minimum requirements

● A minimum of 5 years of experience in the data engineering field
● Well-versed in data modelling techniques such as Kimball star schema, Anchor modelling, and Data vault
● Experience in object-oriented or object function scripting languages such as Python
● Proficient with relational SQL and NoSQL databases, preferably PostgreSQL, including PITR, pg_basebackup, WAL archiving, and replication
● Familiarity with column-oriented storage formats and data warehouses such as Parquet, Redshift, and BigQuery
● Comprehensive experience in developing and maintaining ETL/ELT data pipelines and workflow management tools such as Airflow
● Prior experience with Google Cloud Platform (GCP) services such as BigQuery, scheduled queries, Google Cloud Storage, and Google Cloud Functions
● Familiar with alerting and self-recovery methods concerning data accuracy (a sketch of such a check follows this list)
● Ability to transform data into optimal business decisions
● Experience peer-reviewing pipeline code, suggesting improvements where needed, and helping teams make informed business decisions with data
● Strong presentation skills
● Excellent spoken and written English communication skills
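As an illustration of the alerting and data-accuracy requirement above, below is a minimal sketch of a row-count reconciliation check against BigQuery using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical, and a real deployment would page an on-call channel rather than simply raising.

```python
import logging

from google.cloud import bigquery

log = logging.getLogger("data_accuracy")


def row_count(client: bigquery.Client, table: str) -> int:
    # Count the rows in a fully qualified BigQuery table.
    query = f"SELECT COUNT(*) AS n FROM `{table}`"
    return next(iter(client.query(query).result())).n


def check_counts_match(source: str, target: str) -> None:
    # Alert when the warehouse table has drifted from its staging source.
    client = bigquery.Client()
    src, dst = row_count(client, source), row_count(client, target)
    if src != dst:
        log.error("row count mismatch: %s=%d vs %s=%d", source, src, target, dst)
        raise ValueError("data accuracy check failed")
    log.info("counts match (%d rows)", src)


if __name__ == "__main__":
    # Hypothetical table names.
    check_counts_match("my-project.staging.orders", "my-project.dwh.fact_orders")
```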


Preferred experience

● Prior experience in cybersecurity and data protection
● Exposure to data pipeline and workflow management tools such as Luigi (see the sketch after this list)
● Exposure to maintaining and monitoring database health and resolving errors
● Experience in managing stakeholders’ expectations and in gathering technical requirements
● Familiarity with container technologies such as Docker
● A fintech background
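For the Luigi exposure mentioned in this list, below is a minimal sketch of a two-task Luigi workflow in which a load step depends on a file produced by an extract step; the file paths and placeholder data are hypothetical.

```python
import datetime

import luigi


class ExtractOrders(luigi.Task):
    date = luigi.DateParameter()

    def output(self):
        # Hypothetical landing file for the raw extract.
        return luigi.LocalTarget(f"data/raw/orders_{self.date}.csv")

    def run(self):
        with self.output().open("w") as f:
            f.write("order_id,amount\n")  # placeholder extract


class LoadOrders(luigi.Task):
    date = luigi.DateParameter()

    def requires(self):
        # Luigi builds the dependency graph from requires().
        return ExtractOrders(date=self.date)

    def output(self):
        return luigi.LocalTarget(f"data/loaded/orders_{self.date}.done")

    def run(self):
        with self.input().open() as raw, self.output().open("w") as marker:
            marker.write(f"loaded {sum(1 for _ in raw) - 1} rows\n")


if __name__ == "__main__":
    luigi.build([LoadOrders(date=datetime.date(2024, 1, 1))], local_scheduler=True)
```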

Benefits

  • Performance-based bonuses
  • Competitive salary
  • Health and wellness benefits
  • Education assistance
  • Onsite meals and snacks
  • Flextime