Job Description
About the Role
We are seeking a Mid-Level / Senior Data Engineer to design, implement, and maintain scalable data pipelines and storage solutions that enable company-wide data-driven decision-making. You'll manage real-time analytics systems, collaborate with cross-functional teams, and ensure efficient, reliable data flow.
Technical Skills
- Programming Languages: Python, Java or Scala
- Database Technologies: SQL, NoSQL (e.g., MongoDB, Cassandra)
- Data Processing Frameworks: Apache Spark, Hadoop
- Data Warehousing: Amazon Redshift, Google BigQuery, Snowflake
- ETL Tools: Apache Airflow, Apache NiFi, Talend, Informatica
- Cloud Platforms: AWS, Azure, Google Cloud Platform
- Version Control: Git, GitHub, GitLab
Responsibilities:
- Design, develop, and maintain scalable data pipelines using Apache Spark and other Big Data technologies.
- Build and maintain data architectures on Hadoop or similar distributed file systems.
- Collaborate with cross-functional teams to identify, design, and implement data-driven solutions to complex business problems.
- Optimize data systems for maximum performance and scalability.
- Develop and manage real-time analytics systems, ensuring their reliability, performance, and maintainability.
- Propose and refine data architecture to meet evolving business needs.
- Collaborate with Business Intelligence, Ventures, and Data Science teams to ensure their data requirements are met.
- Monitor and troubleshoot data services, resolving any issues that arise.
- Set up real-time analytics solutions tailored to specific services and business demands.
- Ensure highly efficient data pipelines by identifying and fixing performance bottlenecks.
- Design, implement, and maintain data infrastructure to ensure steady, uninterrupted data flow.
Job requirements:
- Bachelor's or Master's degree in Computer Engineering/Science, or equivalent experience.
- 2+ years of experience in data engineering or a related field.
- Expertise in designing and maintaining scalable data pipelines and Big Data systems.
- Proficiency in the Hadoop ecosystem (HDFS, YARN, Hive, Spark).
- Hands-on experience with Kafka and ZooKeeper for data streaming and coordination.
- Strong programming skills in Python, Java, Scala, or Go (minimum 2 years of experience).
- Familiarity with monitoring systems such as Grafana, Prometheus, and exporters.
- Experience working with Linux, virtualization, Docker, and Kubernetes.
- Proven experience in setting up and maintaining real-time analytics and Big Data systems.
- Hands-on experience with Big Data technologies such as Pig, Kafka, and NoSQL databases.
- Strong communication skills and the ability to work collaboratively in a team environment.
- Excellent problem-solving skills and attention to detail.
- Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI, Metabase).
Technologies
- Python, MongoDB, Git, GitLab, MQL
Bit24 is a cryptocurrency exchange platform for trading digital assets and cryptocurrencies such as Bitcoin, Ethereum, Ripple, Litecoin, Bitcoin Cash, Tether, and other digital currencies against the Iranian rial (IRR). By removing intermediaries, it makes buying and selling easier and more secure, and it gives its users the ability to trade 24 hours a day. The Bit24 team does its utmost to offer the best buy and sell prices for digital currencies and to earn the satisfaction of its users.
Contact Information
Tehran - Gisha (Kuy-e Nasr), corner of 16th Street, No. 145
Benefits
- Insurance
- Breakfast
- Lunch allowance
- Flexible working hours
- Military service sponsorship (Amriyeh)
- Training benefits
- Supplementary insurance
- Team-building budget
- Occasional gifts
- Recreational events
- Game room