Challenge

Developer

Ranking: 1

The client is getting in touch with us to identify roles for a potential new opportunity. In this case they are asking for a Big Data developer with the description below.

Big Data developer:

  • Hadoop administrator/developer experienced with Cloudera Hadoop components such as HDFS, Sentry, HBase, Impala, Hue, Spark/Scala, Hive, Kafka, YARN, and ZooKeeper
  • Should be proficient with Docker and container concepts in a Red Hat OpenShift environment
  • Good knowledge of Oracle and SQL
  • Good knowledge of database structures, theories, principles, and practices
  • Performance and resource management
  • Importing and exporting data with Sqoop between HDFS and relational database systems
  • Developing shell/Python scripts to transform data in HDFS
  • Cloudera Navigator data management component administration
  • Creating a multitenant Enterprise Data Hub
  • Good knowledge of back-end programming, specifically Java
  • Writing high-performance, reliable, and maintainable code
  • Ability to write MapReduce jobs
  • Hands-on experience with HiveQL
  • Familiarity with data loading tools such as Flume and Sqoop
  • Knowledge of workflow schedulers such as Oozie and crontab
  • Analytical and problem-solving skills applied to the Big Data domain
  • Good aptitude for multi-threading and concurrency concepts
  • Knowledge of payments and trade processing life cycles is a plus
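
To make the kind of work listed above more concrete, the following is a minimal, illustrative Spark/Scala sketch of an HDFS-to-Hive transformation similar to the tasks in the description. The object name PaymentsEtlSketch, the path hdfs:///data/raw/payments, the columns payment_id and amount, and the table curated.payments are hypothetical placeholders, not details taken from the posting.

// Illustrative sketch only: a minimal Spark/Scala job that reads raw records
// from HDFS, cleans them, and publishes the result as a Hive table.
// Paths, column names, and the table name are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object PaymentsEtlSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("payments-etl-sketch")
      .enableHiveSupport() // needed to write managed Hive tables
      .getOrCreate()

    // Read raw payment records landed on HDFS (e.g. by Sqoop or Flume).
    val raw = spark.read
      .option("header", "true")
      .csv("hdfs:///data/raw/payments") // hypothetical landing path

    // Basic cleansing: drop incomplete rows and normalise the amount column.
    val cleaned = raw
      .na.drop(Seq("payment_id", "amount"))
      .withColumn("amount", col("amount").cast("decimal(18,2)"))

    // Persist the curated data as a Hive table for downstream HiveQL queries.
    cleaned.write
      .mode("overwrite")
      .saveAsTable("curated.payments") // hypothetical database.table

    spark.stop()
  }
}

In a Cloudera environment a job like this could be submitted with spark-submit and, as the posting suggests, scheduled through a workflow engine such as Oozie.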

Team

users21

Evaluator

Comments: 10

users296

Agency

Comments: 0