- We are looking for new technologies and a desire to change the way business gets done.
- An experienced Head of Big Data is needed to architect big data clusters and establish the big data roadmap and policies. The candidate must be able to:
- Lead cross-functional teams and testing teams, and work closely with data scientists and the business intelligence squad.
- BSc/MSc in Computer Science or a similar technical degree
- Building and coding applications using Hadoop components: HDFS, HBase, Hive, Sqoop, Flume
- Data flow programming, including MapReduce coding using tools such as Java, Python, Pig, Hadoop Streaming and HiveQL
- Implementing relational and dimensional data models
- Understanding of traditional and Big Data ETL/data movement tools & RDBMS, e.g. Kafka, Sqoop
- Leading and delivering an operational Big Data solution using one or more of the following technologies: Hadoop, HortonWorks, Cloudera, Cassandra

Outstanding skills:
- Experience with cloud Big Data infrastructure (cloud or data center, such as Amazon AWS and Google Cloud)
- Industry experience (financial services, resources, healthcare, government, products, communications, high tech)
- Data science and analytics (machine learning, analytical models, Mahout, etc.)
- Data visualization
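To give a flavor of the MapReduce coding skill listed above, here is a minimal sketch of a Hadoop Streaming-style word count in Python. It is illustrative only: the `mapper` and `reducer` function names and the sample input are assumptions, and a real Hadoop Streaming job would run these as separate scripts reading from stdin, with Hadoop handling the shuffle and sort between them.

```python
from collections import defaultdict

def mapper(lines):
    """Map phase: emit (word, 1) pairs, as a streaming mapper
    would write key/value lines to stdout."""
    for line in lines:
        for word in line.strip().split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: sum counts per word. In a real job, Hadoop
    delivers the pairs grouped and sorted by key."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

if __name__ == "__main__":
    sample = ["big data big clusters", "data pipelines"]
    print(reducer(mapper(sample)))
```

The same logic could equally be expressed in Pig or HiveQL (e.g. a `GROUP BY` with `COUNT`); the streaming form is shown because it maps most directly onto the MapReduce model named in the requirements.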