Job Description
- Acquire datasets that meet the needs of the business.
- Design, implement, and test data pipeline architecture.
- Implement new data validation methods and data analysis tools.
- Ensure compliance with data usage security policies.
- Develop services based on massive data.
- Collaborate with the management team to understand the company's goals.
Requirements
- Familiarity with Data Warehouse/Data Lake design architectures.
- Familiarity with Kafka/RabbitMQ, Spark, Airflow, NiFi, and Superset is an advantage.
- Ability to interact and communicate effectively with technical and analytics teams.
- Familiarity with financial markets.
- Bachelor's or Master's degree in Industrial Engineering or Software Engineering.
- At least three years of relevant work experience.
- Proficient in Python, Docker, MongoDB, Redis, and SQL Server.
- Teamwork spirit.
- Ability to communicate effectively and exchange ideas with others.
- Ability to follow through to achieve results.
- Reliable and responsible.
- Flexible and agile.