

Hadoop Sr. Big Data Architect Position 1099 or C2C with a Great Company! Can be based anywhere in the United States. Travel Monday - Thursday (Home for the weekends!) Great pay!

Jessica J.
user 9861896
Fort Worth, TX
Post #: 4
Candidates must be able to work for any employer in the United States. The position would be corp-to-corp or 1099; travel would be Mon-Thurs (home on the weekends), primarily to DC, Tampa, and Dallas. Engagements are 2+ months onsite with clients. This position works on-site with the end client doing implementations and additional hands-on work with system integration, strategy, etc. This is a purely technical role. I think you will be very interested in the position after hearing who our client is (they are very well known in the Big Data arena).

If you are interested in the position, or know of anyone else who may be a good fit for it, please email your resume, contact information, and hourly rate to:

Short Description of the job:

• Design and implement Hadoop architectures and configurations for customers
• Partner with customers and partners to architect, design, and deploy Apache Hadoop environments, and assist in building reference configurations to enable our customers
• Drive projects with customers to successful completion
• Write and produce technical documentation and knowledgebase articles

Requirements:

• More than five years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
• 2+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
• Ability to understand and translate customer requirements into technical requirements
• Experience implementing data transformation and processing solutions using Apache Pig
• Experience designing data queries against data in the HDFS environment using tools such as Apache Hive
• Experience implementing MapReduce jobs
• Experience setting up multi-node Hadoop clusters
• Strong experience implementing software and/or solutions in enterprise Linux or Unix environments
