Prolifics creates competitive advantage for organizations around the world by implementing customized, end-to-end IT solutions that achieve business success, leveraging leading technologies in a global delivery model. For more than 35 years, our industry-specific insights and certified technology accelerators have transformed organizations around the world by solving complex IT challenges. We have a global presence across North America, Europe, and Asia. In India we have five offshore development centers: four in Hyderabad and one in Pune. Prolifics is a CMMI Level 5 certified company. We work collaboratively with technologies from IBM, Microsoft, open-source projects, and packaged applications.
A Hadoop Developer is responsible for the actual coding or programming of Hadoop applications.
- Build distributed, reliable, and scalable data pipelines to ingest and process data in real time. A Hadoop developer deals with fetching impression streams, transaction behaviours, clickstream data, and other unstructured data.
- Develop efficient Pig and Hive scripts with joins on datasets using various techniques.
- Load data from disparate data sets.
- Pre-process data using Hive and Pig.
- Design, build, install, configure, and support Hadoop.
- Translate complex functional and technical requirements into detailed design.
- Perform analysis of vast data stores and uncover insights.
- Maintain security and data privacy.
- Create scalable and high-performance web services for data tracking.
- Enable high-speed querying of data.
- Deploy and manage HBase.
- Participate in proof-of-concept (POC) efforts to help build new Hadoop clusters.
- Test prototypes and oversee handover to operational teams.
- Propose best practices/standards.
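The join-heavy pipeline work above can be illustrated with a minimal Python sketch of a hash join, the same technique behind Hive's map-side (broadcast) joins: the smaller data set is held in memory as a hash table while the larger one is streamed past it. The data sets and field names here are hypothetical.

```python
# Minimal hash-join sketch: build a hash table from the smaller data set,
# then stream the larger one against it. This mirrors what a Hive map-side
# join does; the data below is purely illustrative.

def hash_join(small, large, key):
    """Join two lists of dicts on `key`; `small` must fit in memory."""
    table = {}
    for row in small:
        table.setdefault(row[key], []).append(row)

    joined = []
    for row in large:                       # stream the large side
        for match in table.get(row[key], []):
            joined.append({**match, **row})  # merge matching rows
    return joined

# Hypothetical dimension table (users) and fact stream (clicks):
users = [{"user_id": 1, "name": "asha"}, {"user_id": 2, "name": "ravi"}]
clicks = [{"user_id": 1, "url": "/home"},
          {"user_id": 1, "url": "/buy"},
          {"user_id": 3, "url": "/home"}]   # user 3 has no match

result = hash_join(users, clicks, "user_id")
```

In Hive the same effect comes from a `MAP JOIN` hint or `hive.auto.convert.join=true`; the sketch only shows the underlying idea, not production code.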
- Must have hands-on experience in Big Data technologies such as Apache Hadoop, MapReduce, Hive, Pig, Sqoop, Flume, etc.
- Knowledge of Hadoop architecture, the Hadoop Distributed File System (HDFS), and the wider Hadoop ecosystem.
- Proficient in setting up and working with large Hadoop clusters, including cluster monitoring and maintenance.
- Must have hands-on experience in predictive-analytics modelling.
- Knowledge of the Hadoop ecosystem and its components: HBase, Pig, Hive, Sqoop, Flume, Oozie, etc.
- Working knowledge of Java essentials for Hadoop.
- Working knowledge of basic Linux administration.
- Knowledge of scripting languages such as Python or Perl.
- Data modelling experience with OLTP and OLAP.
- Good knowledge of concurrency and multi-threading concepts.
- Understanding of data visualization tools such as Tableau and QlikView.
- Should have basic knowledge of SQL, database structures, principles, and theories.
- Basic knowledge of popular ETL tools such as Pentaho, Informatica, and Talend.
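The concurrency and multi-threading requirement above can be sketched in a few lines of Python (the worker function and counts are illustrative, not from the posting): a shared counter incremented by several threads, with a `threading.Lock` guarding the read-modify-write so updates do not interleave.

```python
import threading

# Shared mutable state updated by several threads. Without the lock,
# `counter += 1` (a read-modify-write) could interleave and lose updates.
counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:           # critical section: increment is now atomic
            counter += 1

# Four threads, each performing 10,000 increments.
threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 with the lock in place
```

Note that in Python the GIL masks many (not all) such races; in Java, the language this role actually calls for, the equivalent protection would be a `synchronized` block or an `AtomicInteger`.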
Qualifications and Experience
- Years of experience: 4 to 8 years
- Education: Any graduation (B.E., B.Tech., MCA, M.Tech., M.S.)
- Desirable technologies: Apache Spark, Apache Kafka, Apache Storm/ZooKeeper, Core Java, and Hadoop YARN