Big Data Gurus Message Board › FT Job Opportunity in Fremont CA: Full time java hadoop unix engineer


molly m.
Molly_McClure
New York, NY
Post #: 52


Please send your resume to molly at fremontconsulting dot com; please send me an invite to connect on LinkedIn!
Details:

Fremont Consulting is a premier recruiting and staffing firm in Elk Grove, CA

Follow us on Twitter for more job opportunities: @fremontflash

This is a full-time role with one of our clients, a software firm.
Here is the target profile so you can see whether you or someone you know is a close fit: a strong Java and Hadoop developer who also has UNIX skills.


Big Data Senior Software Engineer
Software Products and Services Firm
Location : Fremont, CA
Employee Type : Full-Time
Industry : High Tech
Travel : extensive
Big Data Sr. Software Engineer – Hadoop, Web API, Parallel Programming
This role requires strong programming skills, an understanding of big data and parallelization, and a true passion for massive-scale computing. You will work with complex distributed systems and service-oriented architecture while building services and components that solve large-scale problems in many domains.
The ideal candidate will have worked deeply with service-oriented architecture and distributed systems, and will have a thorough understanding of the problems involved.
Key Responsibilities: The Big Data Sr. Software Engineer will have a strong level of experience in the following:
• Development experience with database technologies in a Java, Hadoop, and UNIX environment
• Developing technical specifications and architecture blueprints for data warehousing and analytics/BI infrastructures.
• Translation of complex functional and technical requirements into detailed architecture and design.
• Being very hands-on; working with the engineering team to manage the day-to-day development activities by leading architecture decisions, participating in designs, design review, code review, and implementation.
• Responsible for the overall systems architecture, scalability, reliability, and performance.
• Responsible for real-time operational support of the team's functional areas.
• Maintaining current technical knowledge to keep pace with rapidly changing technology; staying on the lookout for new technologies and working with management and the development team to bring them in.
• Work with a minimum of technical supervision and supplemental engineering support, while responding efficiently to multiple program priorities.
The ideal candidate will be a proponent of innovation, best practices, sound design with data and information optimization in mind, strong development habits, and efficient team/project structures. Quintiles Information Technology offers a world-class, global organization with ample career development and opportunity. We are seeking change agents who are highly motivated and enthusiastic to join our talented team!

Requirements
MINIMUM REQUIRED EDUCATION AND EXPERIENCE:
1. 5+ years of development experience, preferably in web and data platforms
2. Ability to support multiple concurrent projects
3. Proven experience with large data volumes
4. Experience with the Microsoft stack is essential, as is the ability to learn and use any appropriate platform for the task
5. Degree or appropriate experience required
6. Excellent technical skills are essential for this role
7. Ability to quickly analyze situations and produce sound architecture that scales and is reusable
8. Proficiency with object-oriented design, data structures, and algorithms
9. Strong debugging, troubleshooting, and problem solving skills
Preferred Qualifications
1. Proven track record of delivering results, especially in the areas of writing high-performance, reliable, and maintainable code.
2. Ability to adapt to new development environments, changing business requirements and learning new systems highly desired.
3. Core competencies in Java, UNIX, Hadoop, MapReduce, and the Hadoop Distributed File System (HDFS).
4. Strong knowledge of data structures, parallel computing algorithms, enterprise systems, and asynchronous architectures.
5. Understanding of web services software architectural and design issues.
6. Experience with Hadoop, Hbase and other cloud computing technologies (strong plus).
7. Background with traditional databases, ETL, and data warehousing (plus).
8. Exposure to data workflow and/or scheduling systems (plus).
9. Experience with large distributed services is a plus as is building/operating highly available systems.
10. Ability to work well in a team environment and effectively drive cross-team solutions that have complex dependencies and requirements.
11. Ability to handle multiple competing priorities in a fast-paced environment.
12. Excellent verbal and written communication skills.
-------------------------------------------------------------
keywords: UNIX, Java, Apache Hadoop platform, MapReduce, Hadoop Distributed File System (HDFS), Apache Hive, Apache HBase, Hadoop Common, Java ARchive (JAR), JobTracker, TaskTracker, NameNode, DataNode, Java Runtime Environment (JRE), Secure Shell (SSH), Java Virtual Machine (JVM), Amazon Elastic Compute Cloud (EC2), Amazon Simple Storage Service (S3), HDFS-RAID erasure coding in HDFS



I also welcome inquiries from hiring managers. Thank you.