This is an urgent position with our client, and they are willing to hire ASAP. Kindly review the requirement below; if you are available and interested, please feel free to reach me at firstname.lastname@example.org or 302-401-1081.
Job Title: Principal Hadoop Software Engineer
Location: Santa Clara, CA
Job Type: Full Time Permanent
About Oracle Utilities (Opower)
At Oracle Utilities (Opower), we’re applying cutting-edge computer science to one of humanity’s greatest challenges: Energy. Our utility customers in the United States and abroad give us energy usage data for tens of millions of their customers, which we then analyze and aggregate using state-of-the-art tools such as Hadoop, HBase, and Spark. If you are a top-notch engineer looking for a fast-paced place to work while being surrounded by highly skilled and driven peers, then Opower is the place for you.
About the Job
The Opower Data Platform team is responsible for all of the big data infrastructure that powers our SaaS analytics platform. Our team manages the services that ingest hundreds of millions of smart meter data reads a day and other key customer data. We provide Hadoop clusters for running analytics and machine learning algorithms, maintain BI and reporting tools, and run web services to make all of this data available to developers inside and outside of Oracle.
We are looking for an expert big data engineer who can help drive the next version of our architecture to support new sources of data and new end-users.
Thanks and Regards,
Enormous Enterprise LLC
Consulting | Innovation | Management
Please share your resume at email@example.com.
Title: Technical Specialist
Job ID: ENR_603097
Duration: Full Time
Need Hadoop Administrator with 7+ years of experience.
Skills Required:
• Knowledge of Hadoop, Linux environments, and HBase
• Hands-on experience with Hive and Oozie
• Problem-solving skills
Responsibilities of Big Data Administrator:
a. Maintain security of the Hadoop cluster
b. Maintain HDFS
c. Implement Hadoop infrastructure
d. Coordinate with the data delivery teams to set up new Hadoop users
e. Perform file system maintenance
f. Work with the application teams to install operating system and Hadoop updates
g. Maintain data privacy and security
h. Performance tuning of the Hadoop cluster
• Familiarity with troubleshooting core Java is required